INFO 1101 Unit 1 Paper
Critical Thinking
#4. Each ethical framework offers benefits and problems for AV decision-making. The benefit of looking through a Utilitarian lens is that the AV's choices favor the majority and minimize accidents and harm. For example, removing drunk or otherwise unsafe drivers from the road takes away their opportunity to hurt anyone, which maximizes the benefit to society, and with safe AVs, accidents are minimized and lives are saved. A problem with the Utilitarian lens is that the AV user has no control over their own safety. If the greater good means protecting pedestrians and letting the driver and passengers in the AV die, it would be a gamble to ride in one every day knowing that one's safety is almost always at risk. Not only would that be unsafe, but having to sacrifice anyone can also do mental harm: if one were riding in an AV that chose to prioritize the driver over a pedestrian in a collision, that choice could lead to guilt. The benefit of using Deontology for AV decision-making is that there is a clear, strict set of rules to follow, grounded in moral duties. "You have an imperative to act with duty, obligation, and oath and it's fairly strict" (Carruth, 2023). With Utilitarianism, the AV would have to choose whom to save and the outcome is not always predictable, but with Deontology the AV's choices would be predictable; there is only one right answer. This approach has issues as well. Like Utilitarianism, some situations are unpredictable, making it hard for the AV to reach a clear answer. Deontology also does not allow for personal judgment, and its rules are not flexible. The real-world road is too complex for a single list of strict rules; therefore, it would not be the safest framework for AV decision-making.
The benefit of using Virtue Ethics for AV decision-making is that it would make the AV a moral and responsible vehicle. With those qualities, people may be more willing to use AVs, and there would be a sense of security. It would also connect human moral values to the vehicle. This more humanistic approach to decision-making might fill the gap the other ethical frameworks leave when navigating complex scenarios. According to Aristotle, "by honing virtuous habits, people will likely make the right choice when faced with ethical challenges" (McCombs School of Business, 2018). The issue with applying Virtue Ethics to AV decision-making is that virtuous habits and characteristics are subjective: some may consider a characteristic virtuous while others may not. Writing the algorithm would be extremely difficult because there is nothing specific to encode.
In my opinion, none of these ethical frameworks would be great for AV decision-making, but if I had to choose one for practicality, I would choose Deontology because it is the most predictable. It may be difficult to implement the strict rules, but with those rules in place, you would know what to expect from the AV. It is also not subjective in the way Utilitarianism and Virtue Ethics are, making it a better universal option.
#4a. From a personal standpoint, I think that none of these ethical frameworks works very well for AV decision-making because so much depends on one's personal views. I personally hold views that mix all three of these ethical theories in a unique way, which is probably true for others as well. Using just one framework can be problematic, but using multiple frameworks can be problematic too, so there is no winning. Looking at AVs through the different frameworks, there is far too much room for disaster. Overall, I think applying any ethical framework to AV decision-making is problematic because everyone is different. It is not possible to avoid ethical frameworks in decision-making, but it also does not seem possible for an AV to ever comprehend situations the way we humans do; therefore, I think AVs will always pose some danger no matter the ethical framework. Keeping an open mind, Utilitarianism would be the ethical framework I would choose for AV decision-making based on my personal views, although that choice does not consider practicality, legal requirements, or public response. I would choose Utilitarianism because I think saving more people is the most moral outcome: prioritizing the larger number of people is effective, and I will take whatever causes less harm.
Pseudocode + Reflection
Rule #1
IF the PEDESTRIANS include a pregnant woman and the PASSENGERS do not,
THEN save the PEDESTRIANS
ELSE IF PEDESTRIANS are young,
THEN save the PEDESTRIANS
ELSE IF the PEDESTRIANS include a doctor,
THEN save the PEDESTRIANS
ELSE IF the PASSENGERS are elderly,
THEN save the PEDESTRIANS
ELSE IF the PEDESTRIANS are all humans,
THEN save the PEDESTRIANS
ELSE IF the PEDESTRIANS are more than the PASSENGERS,
THEN save the PEDESTRIANS
ELSE save the PASSENGERS
Rule #2
IF the PEDESTRIANS are male and PASSENGERS are female,
THEN save the PASSENGERS
ELSE IF the PASSENGERS include a criminal,
THEN save the PASSENGERS
ELSE IF the PASSENGERS include a baby,
THEN save the PASSENGERS
ELSE IF the PASSENGERS include an elderly male,
THEN save the PASSENGERS
ELSE IF the PASSENGERS are human,
THEN save the PASSENGERS
ELSE save the PEDESTRIANS
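To make the decision logic above concrete, the following is a minimal Python sketch of Rule #1 and Rule #2. It is an illustration only: the Person fields (species, age_group, sex, pregnant, occupation, criminal) are assumptions introduced for this example, since the pseudocode does not define how pedestrians and passengers would be described to the AV.

# Illustrative sketch of the two pseudocode rules above. The attribute names are
# assumptions made for this example; the assignment does not specify a data format.
from dataclasses import dataclass

@dataclass
class Person:
    """Simplified description of one person (or animal) in the scenario."""
    species: str = "human"      # "human" or "animal"
    age_group: str = "adult"    # "baby", "young", "adult", or "elderly"
    sex: str = "unspecified"    # "male", "female", or "unspecified"
    pregnant: bool = False
    occupation: str = ""        # e.g., "doctor"
    criminal: bool = False

def rule_1(pedestrians: list[Person], passengers: list[Person]) -> str:
    """Rule #1: checks each condition in order and defaults to saving the passengers."""
    if any(p.pregnant for p in pedestrians) and not any(p.pregnant for p in passengers):
        return "PEDESTRIANS"
    if all(p.age_group == "young" for p in pedestrians):
        return "PEDESTRIANS"
    if any(p.occupation == "doctor" for p in pedestrians):
        return "PEDESTRIANS"
    if all(p.age_group == "elderly" for p in passengers):
        return "PEDESTRIANS"
    if all(p.species == "human" for p in pedestrians):
        return "PEDESTRIANS"
    if len(pedestrians) > len(passengers):
        return "PEDESTRIANS"
    return "PASSENGERS"

def rule_2(pedestrians: list[Person], passengers: list[Person]) -> str:
    """Rule #2: checks each condition in order and defaults to saving the pedestrians."""
    if all(p.sex == "male" for p in pedestrians) and all(p.sex == "female" for p in passengers):
        return "PASSENGERS"
    if any(p.criminal for p in passengers):
        return "PASSENGERS"
    if any(p.age_group == "baby" for p in passengers):
        return "PASSENGERS"
    if any(p.age_group == "elderly" and p.sex == "male" for p in passengers):
        return "PASSENGERS"
    if all(p.species == "human" for p in passengers):
        return "PASSENGERS"
    return "PEDESTRIANS"

# Example: three young pedestrians versus one adult passenger.
if __name__ == "__main__":
    pedestrians = [Person(age_group="young") for _ in range(3)]
    passengers = [Person()]
    print(rule_1(pedestrians, passengers))  # PEDESTRIANS (all pedestrians are young)
    print(rule_2(pedestrians, passengers))  # PASSENGERS (all passengers are human)

Each function tests its conditions in the same order as the pseudocode, so the first matching condition decides who is saved.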
The first pseudocode rule aligns with Utilitarianism because it prioritizes the greater good as well as overall happiness. The overall point of the pseudocode is to save more human lives and to save those who benefit society. It prioritizes those who are young and those who