Script ChatGPT
I will be presenting on the issue of privacy and security in ChatGPT. To examine privacy and security in ChatGPT, I used my journal. I use my journal for creativity, self-expression, problem-solving, decision-making, self-discovery, and goal setting. My journal is private. I do not want people reading my journal because it is confidential.
Before I discuss privacy and security in ChatGPT specifically, I would like to connect the concept to the last two books. In The Age of Genomes: Tales from the Front Lines of Genetic Medicine, Lipkin discusses the importance of genetic data and how a person's information and data can be misused and exploited. The author presents that the use of genetic data creates new challenges like unauthorized access and hacking by a third party. Rules and regulations are in place to ensure the security of genetic data, but there is always the fear of genetic data being used for nefarious purposes like criminal activity or personal gain.
In the second book, The Ethical Algorithm: The Science of Socially Aware Algorithm Design, the authors discuss user privacy and data security as essential considerations when designing an algorithm. They explain that algorithms must only collect necessary data and minimize the collection of unnecessary or sensitive data. They discuss specific methods that can be implemented to prevent unauthorized access to or theft of user data, and they explain that privacy and security should be considered throughout an algorithm's entire lifecycle.
This brings me to my topic, privacy and security in ChatGPT, which falls under the theme of innovation, as ChatGPT is one of the most advanced language models currently available. ChatGPT, when you break it down, is an algorithm that analyzes natural language patterns to generate human-like text in response to user input. This raises the issue of user input. People typing into ChatGPT are under the impression that it is confidential because it is like my journal: you are not talking to a physical person, and you are using ChatGPT, like my journal, for self-expression, problem-solving, decision-making, and much more. OpenAI's statement on this matter is that it only saves searches to train and improve its algorithm, but this raises the theme of accountability, which is the story OpenAI is deciding to tell to avoid the blame of collecting specific sensitive user data. Even if OpenAI were telling the truth, Rory Mir, associate director for a privacy rights nonprofit group, describes it this way: at some point, the data they are holding onto may change hands to another company you do not trust that much, or end up in the hands of a government you do not trust that much.
This brings me to the privacy and security aspect. As I mentioned, we view things like web searches and ChatGPT like journals: we assume they are private and secure. According to Jeffrey Chester, executive director of the Center for Digital Democracy, a digital rights advocacy group, consumers should view these tools with suspicion at the least, since, like so many other popular technologies, they are influenced by the forces of advertising and marketing. He is basically saying we should avoid treating ChatGPT and other language models like journals. This connects to eudaemonism: who determines what is right and wrong when discussing privacy and security? Is it OpenAI? Is it the government? Is it individual people? We must pursue the good and the general welfare.
Companies, especially banks, have already banned ChatGPT. It is not just an anti-technology stance, however: the banks simply will not allow the use of third-party software without a thorough vetting, which makes sense when you consider that their entire industry lives and dies on keeping its clients' money secure. Samsung just banned the use of chatbots by all staff at the consumer electronics giant, ending staff access to ChatGPT, Bard, and Bing after sensitive corporate secrets were accidentally leaked by employees on chatbots.
Looking specifically at ChatGPT and AI chatbots, as I mentioned, they do save chat history for training, which is meant to improve the model. This raises privacy and security concerns when, for example, I ask ChatGPT about a specific medical condition I have. Models give you the option of opting out of chat data collection or deleting your chat history. However, we do not know anything for sure; this could just be a narrative that OpenAI is using to access user data. Also, ChatGPT does not make it known when you sign up that it has this feature and that it collects user data.
Beyond ChatGPT, what are companies using your chats for? For ChatGPT and Google's Bard, we are told they use the questions and responses to train the AI models to provide better answers. However, chat logs could also be used for advertising. For example, if I search for cancer treatments and symptoms in ChatGPT, I will see more cancer ads when I search the internet.
Additionally, there is actual evidence of this: WebMD and Drugs.com, both popular online sources of medical information that provide users with resources to research various health topics, medications, and treatment options, shared users' sensitive health concerns about depression and HIV with advertisers. These advertisers are data brokers who sell lists of people with health concerns for targeted advertising. Chronically