People in Illinois who follow the development of autonomous cars may have heard about a fatality involving a pedestrian in March. According to one professor at the University of Arizona, the state where the accident occurred, the problem with the programming of autonomous cars is that they are being programmed to drive like humans. He says this results in the cars making the same types of errors human drivers would.

Video from the deadly accident shows the pedestrian stepping into the road at a point with no street lighting and no pedestrian crosswalk. According to the professor, the error was that the car, like a human driver, proceeded through an area on the assumption that no obstacles were in the way, despite being unable to visually confirm this. He says a self-driving car should travel only at a speed from which it can stop for obstacles within its detection range. In other words, the car should have assumed that obstacles might exist just beyond what its sensors could confirm, rather than assuming the road ahead was clear.
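The professor's principle can be sketched as a simple kinematics calculation. The function below is purely illustrative, not the professor's actual model: it assumes constant braking deceleration and a fixed reaction delay (both parameter values here are hypothetical), and solves for the highest speed at which the total stopping distance still fits inside the sensor's detection range.

```python
import math

def max_safe_speed(detection_range_m, decel_mps2=7.0, reaction_s=0.5):
    """Highest speed (m/s) at which the vehicle can still stop within
    its detection range.

    Assumes constant deceleration `a` after a fixed reaction delay `t`,
    so stopping distance is v*t + v**2 / (2*a). Setting that equal to
    the detection range d and solving the quadratic for v gives:
        v = -a*t + sqrt((a*t)**2 + 2*a*d)
    The parameter defaults are illustrative assumptions, not measured
    values for any real vehicle.
    """
    a, t, d = decel_mps2, reaction_s, detection_range_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# Example: if sensors can only confirm the road is clear for 30 m ahead,
# the car should not exceed roughly this speed (in m/s).
v = max_safe_speed(30.0)
print(f"{v:.1f} m/s ({v * 3.6:.0f} km/h)")
```

Under this model, a shorter confirmed-clear distance (darkness, an obstructed view) directly forces a lower speed, which is the professor's point: the car should slow down when it cannot see, rather than assume the unseen road is empty.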

The professor also pointed out that while an accident in which a human driver injures or kills another person is tragic, such an accident caused by an autonomous car may also destroy the industry. His own research focuses on guaranteed reactions by computer systems.

Despite such incidents, autonomous vehicles are predicted to be significantly safer than human drivers. However, it will be some time before they are widespread, and in the meantime, human drivers will continue to cause accidents through fatigued driving, distracted driving and driving under the influence of alcohol or drugs. When a driver causes an accident that injures others, the driver's insurance company is supposed to compensate the injured, but the offer of compensation might be insufficient. It may be necessary to file a lawsuit in order to obtain adequate compensation.