Many articles,1 journals,2 books,3 and other sources have discussed the continued and expanding use of vehicles driven by computers rather than by human beings.4 We have written and presented many programs on driverless vehicles (“DV”).5 Most of those materials have focused on the technological aspects of such vehicles.
Some of our articles,6 too, have directed the reader’s attention to the stages of technological development for DV. This article is not directed to such discoveries. Rather, the DV field must be examined across a multitude of areas. One area that has not been examined in much detail is the set of ethical concerns raised by DV and their impact on real estate.
A related discussion covers the legal concerns that arise in the real estate field when DV are employed. This article examines the many issues that are being raised, and that should be raised, as to legal matters and ethical concerns for real estate owners, investors, and businesses if DV are employed in our society.
DV—What Are They?
When we use the term “DV” in this article, we include the myriad terms that describe a vehicle that does not have a human being inside to direct it. Thus, for our purposes herein, the term “DV” includes the AI (artificial intelligence) vehicle, the SDV (self-driven vehicle), and other terms meeting this definitional standard: a vehicle directed by a computer, whether inside or outside the vehicle, with no driver within the vehicle.7
DV are already being employed for many uses throughout the world.8 Such uses range from operating the standard personal car to other applications, e.g., farming,9 shuttle services,10 boats,11 trucks,12 mining,13 flying,14 and other instances where a vehicle is employed to accomplish various functions for mankind. Most of the discussion of ethics and legal issues in this Note will apply to most applications of DV related to the use of real estate, be it the traditional automobile employed in business or some of the lesser-known uses of DV, such as in farming operations, e.g., harvesting.
Legal Issues and DV
We often hear the comment that legal issues permeate all that we do in life. This is probably true. For example, inviting a guest to your home or business would not normally be a focus of legal concerns. However, what if the guest chokes on something the guest ate? Is the host responsible? (Yes, yes, we need to know more facts!) What we do know is that there may be a legal issue arising from this event. Likewise, if the guest tripped on the carpet, is the host responsible? (Again, we need to know more facts, such as whether the carpet was torn, worn, etc.) What we can agree on is that such an event could generate legal questions as to liability.
If we assume the premise that there are legal issues connected with most events and behavior, we might apply this same approach to the DV setting.
However, this position might apply a fortiori to DV, since DV are recognized as new technology. Further, there are typically laws in each state, country, province, etc. that specifically require certain actions relative to the DV. One of the more common laws in this setting is a direction that the vehicle must have a driver.15 Surely, when such a law was passed, most legislators probably did not object to it. The need for a driver was obvious. Then! Now, we need to modify our laws to allow the DV to be operated without a driver.
One might think this change is somewhat akin to the old real property rule that one owns real estate down to the center of the earth and as high as one can go. Such a rule, again, seemed quite acceptable to lawmakers when it was adopted. However, as we know, we have had to “adjust” our laws with the advent of technology, such as airplanes, drones, etc. Our legislators must be busy at work making the changes that must take place, such as providing for a vehicle to be run without a driver.
That a DV operates without a driver is, one would argue, a truism. However, many jurisdictions have not modified their prior statutes and rules stating that a driver must be present in the vehicle, even when the vehicle is designed not to require a driver.
However, a more difficult legal issue about DV might entail the concern with who is liable, if anyone, in the event there is an accident involving a DV.16 If shopping center owners “invite” shoppers to a mall, where the design is for a DV to drop off a shopper, is there any liability to the passenger/shopper, DV owner, mall owner, mall manager, or others that are involved in this setting?
Historically, we would answer that the one driving, who was at fault, such as failing to stop at a stop sign or red light, should be liable for the injuries from an accident that occurred because of such driving.17 But should there be no driver, who then is liable?18 This issue is being addressed in many countries. It is not limited to the USA.19
Do we say the owner of the vehicle is liable? If so, how will such an owner provide protection? Would that owner obtain insurance? Would insurance companies insure such DV? What if the accident was not caused by the owner of the vehicle involved in the accident? That is, for example, what if the DV was programmed to avoid accidents and a small child stepped into the street? If the DV swerved to avoid the child and hit your car, would the owner of the DV be responsible? What if a second DV (“DV #2”) was damaged or someone in it was injured? Is the owner of the first DV (“DV #1”) responsible? Is the owner of the property where the accident took place liable?
Attorneys, especially attorneys dealing with litigation involving accidents, need to address these issues. Possibly the law would say no one is responsible. Possibly a fund will be created to compensate parties involved in such accidents.20 There is one such proposal, referenced as the Automated Vehicles Act, which would, if enacted, apply in the UK.21
Such a reserve might be funded by a tax paid into the fund when one buys or operates a DV. There are many approaches to providing such funds. The point at this stage, however, is to recognize that there are legal concerns regarding possible liability that must be considered whenever a DV comes into play.
When one engages a DV to travel from point A to point B, should such a traveler have any liability simply for having engaged the vehicle?
How will insurance change because of the use of DV as opposed to privately owned, driver-operated vehicles?22 It has been suggested that DV will substantially change the insurance industry, since what insurers cover, the vehicle and the parties connected with it, will change substantially once vehicles enter the DV realm.23
What happens during the transition period, when some driver-operated vehicles and some DV are operating at the same time? Do different laws apply in this setting? There are also ethical issues with DV, which can overlap with these legal issues in some instances.
DV and Ethical Issues
Having established that there are legal issues that can arise when operating DV, what are some of the ethical issues that can arise with DV?
Suppose we examine the earlier example of the child that stepped out in front of a DV. The DV might, as noted, swerve and hit someone else.
The question arises: with no driver in the vehicle, what would cause the DV to swerve? If we expand on the earlier example and assume that the first DV, DV #1, was programmed to avoid hitting a living object, such as the child, and the DV swerves from the child and hits a building, one might say that the action was “right.” That is, in our society, and in most societies, the value of a human life is greater than that of an inanimate object, such as the building.24 (We might add, as a side note, that there is now an additional legal issue in this setting. The programming for the DV might have been designed to avoid humans. However, in doing so, it caused damage to the building. Should the owner of the DV be liable for the damage to the building? We could, of course, again discuss who might be liable in this setting. However, having noted this legal issue, we should stay with the subject at hand, viz., the ethical issues.)
An ethical concern might be with the “proper” or “ethical” approach that should be undertaken in the programming.25 Should the programmer — or those instructing the programmer — design the software to be certain that the child is not hit in the above case? Of course, this is the conclusion stated above. But take it to the next step. What if we assume the DV was faced with turning away from hitting the child, but the option this time was to swerve and hit 2 pedestrians? What if the pedestrians are older?
What if there is a third option: The DV could be programmed to hit a wall, avoiding hitting the child and avoiding hitting the other 2 pedestrians? However, in hitting the wall the occupant of the DV will be killed. (Is the owner of the real estate, the “wall,” liable?)
Ethically, and possibly legally, how should a DV be programmed in this type of setting, and who should make the decision as to the programming (e.g. the owner of the DV, the operator of the DV, the manufacturer of the DV, the government that permitted the DV on the road, the owner of the real estate where the DV operated, etc.)?26
If, for example, the owner of the DV is to make these decisions, must the owner of the DV, who structures the programming for the DV, anticipate the various situations that the DV will face? If the owner of the DV does not provide programming for the actions of the DV, is the owner responsible?
These life-or-death decisions, determining whether the vehicle, be it a car, a trolley, or another machine, crashes into one person or into others, remain an important issue that must be resolved when considering how a DV will be programmed.
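To make the programming question concrete, the choices discussed above can be sketched, purely as an illustration, as a priority rule encoded in software. The function, weights, and scenario values below are hypothetical assumptions for exposition, not any manufacturer's actual design; the article's very point is that someone must decide such weights.

```python
# Illustrative sketch only: a hypothetical priority rule for the scenarios
# discussed above. Real DV software is far more complex; every name and
# weight here is an assumption, not an actual vendor's design.

def choose_maneuver(options):
    """Pick the maneuver with the lowest expected harm score.

    `options` maps a maneuver name to (humans_harmed, property_damage).
    Human harm is weighted far above property damage, reflecting the
    value judgment discussed in the text; the weight itself is arbitrary.
    """
    HUMAN_WEIGHT = 1_000_000  # any human harm dominates property damage

    def harm(outcome):
        humans, property_damage = outcome
        return humans * HUMAN_WEIGHT + property_damage

    return min(options, key=lambda name: harm(options[name]))

# The child-vs-building example: swerving into the building harms no
# person but damages property, so it is preferred over hitting the child.
scenario = {
    "continue_straight": (1, 0),          # hits the child
    "swerve_into_building": (0, 50_000),  # property damage only
}
print(choose_maneuver(scenario))  # prints "swerve_into_building"
```

Note what this sketch hides: the weight placed on human life, the treatment of the occupant versus pedestrians, and the trolley-style trade-offs among different numbers of people are all value judgments someone, be it the owner, manufacturer, or government, must make in advance.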
In considering these situations, where a choice must be made as to who is injured or who dies, many refer to the well-known “Trolley” problem or cases. One article discussing this area, which labels the DV an “AV,” stated:
It has not escaped notice that some accident scenarios bear a resemblance to what are known in philosophy as “trolley cases.” These are imagined scenarios in which a runaway trolley will result in the death of some number of individuals unless a choice is made to divert or otherwise alter the trolley’s course, resulting in some other number of deaths. In the classic trolley case, the trolley is headed down a track and will kill five people that can’t escape. A bystander can pull a switch and divert the trolley onto another track. However, on this track there is one person who cannot escape and will die if the trolley is diverted. A similar scenario involves an AV that is traveling down a street when suddenly a group of pedestrians runs into the street. The only way to avoid hitting them is to take a turn that will result in the death of a pedestrian on the sidewalk. In another version of the trolley case, a trolley cannot stop and will kill five people unless an object of sufficient weight is pushed in front of it. A bystander has the option of pushing a large person off a bridge and onto the tracks in a way that would stop the train before it kills the five. Again, a case involving an AV might have a similar structure: Perhaps an empty AV has gone out of control and will hit five pedestrians unless another AV with a single passenger drives itself into the first AV.27
In another article addressing this trolley issue, author Brown stated:
That’s where philosophers and ethicists have long brought up one of the foundational issues facing an autonomous-driving future. It’s known as the “trolley problem,” and it basically boils down to this: How do you teach a car to make complex, life-or-death decisions in seemingly lose-lose scenarios on the road? And if cars can’t do this, would you trust them to carry your child to school or your parent to a doctor’s appointment?
Autonomous-driving researchers say the industry is far from deciding how AI chooses who receives the brunt force of an oncoming accident.28
The above-noted author, Brown, commented further on this issue by stating:
There have been attempts to quantify human perspectives on AI-driving decisions. One method came from researchers at the Massachusetts Institute of Technology (MIT), who conducted a global study to find a consensus approach to the trolley problem.
The trolley problem is a decades-old moral dilemma where someone must fictionally decide whether to steer a trolley towards one person to save a larger group of people. Generally, people think AI’s responsibility is to spare as many lives as possible, even if that means killing a few, according to an experiment published in 2018.29
This “Trolley” problem remains an issue with DV.
There are other settings in which decisions involving the lives of many people must be made in a DV context. If these decisions are made in connection with a DV on the real property of a given owner, such as an office complex, an industrial complex, a shopping center, etc., does the owner of the real estate who permits the use of DV on the property bear any responsibility? Will such an owner resist the use of DV on the premises?
What if one, riding in a DV, becomes very ill and the DV is programmed to take the rider to the hospital? Can and should the DV be allowed to speed to the hospital, endangering others, but potentially helping the ill occupant in the DV? Is this what a human driver would do in this setting, and therefore, is such an action acceptable?
There are many moral issues related to DV. One examination of many of these issues was conducted in what is known as “The Moral Machine” experiment. The authors summarized this experiment, which dealt with artificial intelligence generally, as follows:
With the rapid development of artificial intelligence have come concerns about how machines will make moral decisions, and the major challenge of quantifying societal expectations about the ethical principles that should guide machine behaviour. To address this challenge, we deployed the Moral Machine, an online experimental platform designed to explore the moral dilemmas faced by autonomous vehicles. This platform gathered 40 million decisions in ten languages from millions of people in 233 countries and territories. Here we describe the results of this experiment. First, we summarize global moral preferences. Second, we document individual variations in preferences, based on respondents’ demographics. Third, we report cross-cultural ethical variation, and uncover three major clusters of countries. Fourth, we show that these differences correlate with modern institutions and deep cultural traits. We discuss how these preferences can contribute to developing global, socially acceptable principles for machine ethics.30
There are many other situations involving DV that we can envision raising legal and ethical issues. Suffice it to say that such issues exist, and those dealing with DV must be positioned to engage and solve them, hopefully before they arise.31
Creative destruction exists.32
There will be those that will lose employment because of DV.
There will be those that will gain from the DV surge. Property owners employing DV shuttles may substantially reduce the costs of transporting customers, etc. However, everyone will face, with the advent of DV, ethical and legal issues.
This Note is an attempt to further the discussion as to these two areas of ethics and law as they interplay with the introduction into society of the DV, especially those connected with employing DV relative to the use of real estate.
As mentioned, there are many, many additional issues that must be addressed because of the use of DV. By taking up only a part of these issues, ethics and law, it is hoped that there can be greater focus on these areas and on solutions to the many problems that have already arisen. What one will quickly observe is that attempts to address ethical and legal issues will overlap with many of the other issues impacting DV. For example, notwithstanding the technology that has developed, the technology is sometimes hampered by legal concerns, such as the earlier-mentioned concern that many older statutes provide that a driver must always be in the moving vehicle. With the advance of technology, many manufacturers are now removing the steering wheel and providing no “driver’s seat.” Thus, the technology has moved to this driverless position, yet the laws in some locations have not been modified to allow it. As Kayla Matthews recently stated in an article:
“In 2017, a total of 33 states introduced driverless vehicle legislation as opposed to 20 states the year prior. So, it’s clearly ramping up. Today, 29 states have existing laws that deal with autonomous vehicles and their systems. But federal action is another concern, outside of local and state legislators. The National Highway and Transportation Safety Administration released new guidelines for automated driving systems in September of this year. The Senate is also working on introducing autonomous vehicle legislation.”34
Again, the point is that these areas of concern overlap in many instances. The overlap between the legal requirement for a driver and the technological decision to remove the steering wheel (because there is no driver in the vehicle) is a prime example of this concern.
We can be certain that much more will be developing in this area of DV as to ethical and legal issues, including these issues when addressing real property interests.