Activating self-driving mode

Autonomous vehicles have been in the news recently. Trains, trucks and even airplanes are now operating without human help, and testing of self-driving cars has been extensive. They are sure to be a commercial reality sooner rather than later.

Self driving car in traffic. Image credit: Sundry Photography / Shutterstock.com

But this does raise some questions. Unsurprisingly, many concern motoring laws. Just how will autonomous vehicles affect them?

  • Will a ‘driver’ of a fully autonomous car need insurance?
  • Who will be responsible in the event of an accident? The driver or the vehicle manufacturer?
  • Will the use of self-driving cars be restricted in any way? Will they be banned from roads near schools, or from the fast lane of a motorway, for example?
  • How will the laws differentiate between human drivers and semi or fully autonomous vehicles?
  • Can you be drunk in charge of a self-driving car?
  • How will autonomous vehicles affect road safety?

The biggest question of all? When an autonomous vehicle is operating, who is ultimately responsible for it and its actions? The owner/driver, or the manufacturer of the autonomous driving system?

Laws drafted for human drivers need to be adapted and updated.  Quickly.  The reality of commercially available self-driving cars is hurtling towards us.

The basic problem is that existing motoring laws require a driver to have full control of the vehicle. Plainly this is not the case with self-driving cars. New laws need to reflect this.

woman in self driving car

The Aussies are taking the lead

In Australia, a commercial rollout of self-driving cars is expected by 2020.

The National Transport Commission (NTC) is contemplating changes in the law. It has asked for evidence from the police, the automotive industry and other experts on how they believe self-driving vehicles will affect the transport system.

In a statement Chief Executive of the NTC Paul Retter said:  “The introduction of more automated vehicles will see elements of the driving task shift away from the human driver to the automated driving system but our laws currently don’t recognise these systems.


“We need to ensure that relevant driving laws apply to automated vehicles when the automated driving system—rather than the human driver—is operating the vehicle.

“We have been tasked with identifying, and if necessary, removing, legislative impediments to automated vehicles. But we must also maintain the intent of existing laws—to ensure the safe operation of vehicles on Australian roads.

“Legislation must recognise a legal entity that can be held responsible for the automated driving system.”

self driving car on city street

The NTC have also released a discussion paper.  It makes interesting reading.

The report states there are eight major issues for lawmakers to consider:

  1. Current driving laws and offences assume a human driver
  2. An ADS is not a person and cannot be legally responsible for its actions
  3. Current law does not provide for a legal entity, which we describe as an automated driving system entity (ADSE), to be held responsible for the actions of the ADS
  4. Some legislative duties and obligations given to drivers could not be controlled by the ADSE if an ADS is the driver
  5. Safety duties may need to be carried out by someone else if the driver is an ADS
  6. ‘Control’ and ‘proper control’ of a vehicle are not defined if an ADS is driving
  7. There are no legal obligations on a human who may be required to take over the driving task (a fallback-ready user) to ensure he or she is alert and ready to do so
  8. Current compliance and enforcement measures may not be suitable to ensure the safe operation of an ADS

The paper and the consultation process are primarily concerned with pinpointing responsibility. And they seem to be leaning towards making the maker of the autonomous driving system responsible, rather than the driver.

young man in autonomous car

In the conclusion to their report the NTC make these suggestions for discussion:

  • Existing road traffic penalties are clearly aimed at influencing the behaviour of human drivers — without change, they are unlikely to be appropriate or effective when applied to an automated driving system entity (ADSE)
  • If existing road traffic penalties apply to an ADSE, corporate multipliers are likely to increase the effectiveness of those penalties
  • Breaches of road traffic laws should be taken as evidence of a broader failure to provide safe automated vehicles and as a breach of the primary safety duty or other specific offences included in the safety assurance system
  • A primary safety duty should be examined as part of the safety assurance system reforms

So, make the ADS manufacturer legally responsible, and safety on the roads should improve. And safety is the biggest advantage autonomous vehicles have over human-driven cars. Assuming the ADS works, of course.

Why we need self-driving cars

Statistics from the States suggest 97% of car accidents are due to driver error. Remove the driver from the equation and the number of accidents, and therefore deaths, should plummet.

car crash
Image credit: PongMoji / Shutterstock.com

It all makes sense.

But can the lawmakers really make a system legally responsible?  Surely the same argument has failed before?

If a gun kills someone, it is the person who fired the gun who is responsible, not the gun maker. If a truck hits a pedestrian, it is the driver's fault, not the truck manufacturer's. If a person chokes on an apple, it isn't the fault of the farmer who grew the fruit, nor of the shop which sold it.

Motoring laws need to change.  That much is obvious.  But are the Aussies on the right track?

Is the ADS ultimately responsible for the actions of the car it is driving? Or should the human travelling in the vehicle still bear the ultimate responsibility, even if he or she isn't in control of the car?

Let us know what you think in the comments box below.


6 COMMENTS

  1. Considering the number of computers that will be required in a self driving car, there will be plenty of telematics to show precisely who was in charge and therefore assign blame for any accidents or infringements of existing laws. If a car in autonomous mode runs a red light, then the car manufacturer is clearly to blame, if not, then the driver is charged accordingly.

    The biggest concern and worry is that the software programmer who has created the system will have a moral dilemma on their hands.

    An accident is about to happen due to a truck losing control and crossing into the path of our autonomous car. Sensing that a collision is inevitable, the car has three choices.
    a) Stay on a collision course with probable fatalities of both drivers.
    b) Swerve to the left, onto the pavement where there are children playing, with obvious consequences.
    c) Swerve to the right, into other oncoming traffic, again with predictable results.

    That decision has to be calculated into the system, as the system needs to make a split-second decision. What would you do?

  2. Hi Craig, you are correct, the outside lane is the fast lane because clearly you cannot overtake at a slower speed. QED

  3. Having written software in the industrial control field, I am aware that I would have been held responsible for my work, and if an error caused loss of life I would have been in the firing line for blame, financial loss, loss of professional status and loss of personal freedom. How many people working in the field of ANS cars are concerned about the repercussions of a single minute failure in their software or hardware? The clean version of Murphy's Law — if it can go wrong, it will go wrong — will always apply. No amount of beta testing will eliminate all possible failures. I may end up being taken to my funeral in an all-electric hearse, but I hope that it will be driven by a human undertaker.
