
Care Levels and Automation

We proceed stepwise by first analyzing the model without activity levels.³² The social planner’s objective is to determine the socially optimal care level that minimizes the total social costs:

\min_{x}\; S = x + a + p(f)L \qquad (3.4.1)

³² By maintaining Shavell’s (1980; 1987) traditional assumptions, under which the cost of care and expected harm are proportional to the activity level, the results of this section do not change when activity levels are included in the model (see Section 3.5).

The following first-order condition defines the socially optimal care standard x̄ (omitting arguments):

1 + \beta\, p_f\, L = 0 \qquad (3.4.2)
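As a short derivation sketch, under the reading that expected harm depends on care only through f, and with β denoting the marginal effect of care on f (i.e. f_x = β), differentiating (3.4.1) with respect to x gives

\frac{\partial S}{\partial x} = 1 + p_f\, f_x\, L = 1 + \beta\, p_f\, L = 0,

which is condition (3.4.2).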

Likewise, the adoption of automated safety technology is socially desirable up to the point at which the marginal cost of automation equals its total marginal benefit, which defines the socially optimal automation level ā:

1 + p_f\left(f_a\, x + \gamma\right) = 0 \qquad (3.4.3)

By considering the relationship between the socially optimal level of care and the level of activity automation, we obtain the following results:

Proposition 3.4.1. As automation levels increase, the efficient due-care standard decreases.

Corollary 3.4.2. In the limiting case of fully automated activities, the standard of due care is zero, and a Negligence regime effectively becomes a de facto No Liability regime.

Proof. See Appendix 3.7.
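As a purely numerical illustration of Proposition 3.4.1 and Corollary 3.4.2, the following sketch minimizes the social cost (3.4.1) over a grid of care levels for several automation levels. The functional forms p(f) = e^{-f} and f = (1 − a)x + γa, as well as the parameter values, are illustrative assumptions made only for this sketch, not the model’s specification; with them, the efficient care level x̄ decreases monotonically in a and reaches zero before full automation.

```python
# Numerical illustration of Proposition 3.4.1 / Corollary 3.4.2.
# The functional forms and parameter values below are illustrative
# assumptions, not the specification used in the chapter.
import numpy as np

L = 100.0        # assumed harm if an accident occurs
gamma = 6.0      # assumed effectiveness of automation
x_grid = np.linspace(0.0, 10.0, 10001)   # candidate care levels

def social_cost(x, a):
    """S = x + a + p(f)L, cf. (3.4.1), with assumed p(f) = exp(-f), f = (1-a)x + gamma*a."""
    f = (1.0 - a) * x + gamma * a
    return x + a + np.exp(-f) * L

for a in (0.0, 0.25, 0.5, 0.75, 1.0):
    x_bar = x_grid[np.argmin(social_cost(x_grid, a))]   # efficient due-care level x_bar(a)
    print(f"a = {a:.2f}  ->  efficient care x_bar = {x_bar:.2f}")
```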

According to Proposition 3.4.1, the efficient negligence standard should be tailored to automation levels. The practical implementability of this proposition requires courts to set due-care standards contingent upon the level of automation and to establish the causal relationship between autonomous devices and accidents. This result is intuitive.

For example, consider a self-parking car hitting a pedestrian. Intuitively, if the accident happened while the car was parking itself, courts should apply a lower due-care standard to the car operator, since the system was expected to operate the vehicle safely without human input. In this case, the concept of the reasonable person used to evaluate negligent behavior should consider, among the other specific circumstances of the case, the adopted automation level. Yet if instead the accident occurred while the human operator was driving, the fact that the vehicle is equipped with automated-parking devices should not affect the definition of the due-care standard.

A higher γ means a higher effectiveness of automation devices in reducing expected accident costs, thus requiring a lower standard of care for a given automation level. The effectiveness factor γ can also be interpreted as the development status of an automated technology. Under this interpretation, for a given automation level, the standard of negligence should optimally evolve over time to keep pace with the technology’s development status.

Let us now consider the private incentives to invest in care and automation under the rule of negligence. The private cost function of the injurer is given as:

\min_{x}\; T =
\begin{cases}
x + a + p(f)L & \text{if } x < \bar{x} \\
x + a & \text{if } x \geq \bar{x}
\end{cases}
\qquad (3.4.4)

The following first-order condition defines the privately optimal care level x (omitting arguments):

1 + \beta\, p_f\, L = 0 \qquad (3.4.5)

Note that, for a given automation level, condition (3.4.5) coincides with the social condition (3.4.2): an injurer who does not comply with the standard bears the full social cost x + a + p(f)L under (3.4.4), and therefore chooses the care level that is socially optimal given his automation investment.

Proposition 3.4.3. Under tailored negligence standards, injurers will always have incentives to comply with the care standard. Under non-tailored negligence standards, the privately optimal care level falls below the due-care standard as investments in automation increase. This might induce potential injurers either to exercise excessive and inefficient care, or not to adhere to the non-tailored standard, preferring to exercise privately optimal care and be considered negligent in case of an accident.

Corollary 3.4.4. Injurers have stronger incentives to invest in automation under tailored negligence standards than under non-tailored negligence standards.

Proof. See Appendix 3.7.
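To make Proposition 3.4.3 and Corollary 3.4.4 concrete, the following sketch evaluates the injurer’s cost function (3.4.4) under the same illustrative functional forms and parameter values as in the previous sketch (again, assumptions made only for this illustration, not the model’s specification). For a fixed automation level, it compares the injurer’s best response when the due-care standard is tailored to that automation level with the case in which the standard remains anchored to zero automation: under the tailored standard the injurer complies, whereas under the non-tailored standard the privately optimal care falls below the standard and the injurer prefers to bear expected liability. Because the injurer’s total cost at the given automation level is lower under the tailored standard, the private return to investing in automation is also larger, in line with Corollary 3.4.4.

```python
# Numerical illustration of Proposition 3.4.3 / Corollary 3.4.4.
# The functional forms and parameter values below are illustrative
# assumptions, not the specification used in the chapter.
import numpy as np

L, gamma = 100.0, 6.0
x_grid = np.linspace(0.0, 10.0, 10001)

def p(f):                       # assumed accident probability
    return np.exp(-f)

def f(x, a):                    # assumed "effective precaution" index
    return (1.0 - a) * x + gamma * a

def efficient_care(a):          # x_bar(a): care level minimizing S = x + a + p(f)L
    return x_grid[np.argmin(x_grid + a + p(f(x_grid, a)) * L)]

def private_cost(x, a, x_std):  # eq. (3.4.4): expected liability borne only if x < x_std
    return np.where(x < x_std, x + a + p(f(x, a)) * L, x + a)

a = 0.5                         # injurer's automation level
for label, x_std in (("tailored", efficient_care(a)), ("non-tailored", efficient_care(0.0))):
    T = private_cost(x_grid, a, x_std)
    x_star = x_grid[np.argmin(T)]
    print(f"{label:>12} standard x_bar = {x_std:.2f}: injurer picks x = {x_star:.2f}, "
          f"cost = {T.min():.2f}, complies = {bool(x_star >= x_std)}")
```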

We should remark here on an interesting legal implication. A phenomenon which has developed gradually over the past several decades is the migration of activities from the domain of common law torts to regulatory law. Whereas automobile accidents are frequently given as exemplars of tort law, the reality is that most automobile torts are entirely determined by statutory obligations. Assuming that automated technologies will tend, at least during the developmental stage of a product, to exhibit a large degree of heterogeneity, and further assuming that it would be impractical for legislatures to enact statutory standards to cover every possible permutation of technological improvements as they are released to the public, the effect of a “tailored” negligence standard is to return tort law to its origins. Rather than assessing whether a defendant’s conduct satisfied some statutorily determined standard, the tailored approach requires judges and juries to decide cases on γ < pL or “reasonable person” bases.

Standards of negligence tailored to the level of automation underlying the injurer’s activity allow the potential benefits of automated technologies to be fully exploited. Consider self-driving cars. If the standard of negligence remains unchanged with respect to the automation level, the operator of a self-driving car could be held liable for an accident if he was not monitoring the road ahead while autonomous devices were driving the vehicle. In this case, the operator of a driverless car could have incentives either to always monitor the actions of the automated devices, thus nullifying their function, or not to adopt safer automated devices in order to maintain direct control of the vehicle. By contrast, tailored standards are aligned with the purposes of automated technologies, allowing a driver to be distracted while automated devices are operating the vehicle without being threatened by liability. As a consequence, under tailored due-care standards potential injurers have incentives to adopt safer automated technologies, especially for risky actions, and to prove their actual usage in court to avoid full liability costs.