INFO331 Q&A WK 5 Forum- 100%
WK5: Business Processes and Your Information System
Forum:
People all over the world share videos on YouTube, and the platform uses machine learning to analyze how they use the site. The way it surfaces the videos each user is most likely to watch and interact with changes constantly. YouTube describes its recommendation system as one that finds videos for viewers so they do not have to search for them: when a person visits the site, the system pulls videos related to their interests. The main goal of this search-and-discovery approach is to show each viewer the videos they are most likely to enjoy and find engaging. Note that every viewer's suggestions look different, because they are based on that person's own tastes and watching habits.
How YouTube ranks videos in search is based on three main factors: relevance, interest, and quality. Relevance is signaled by how well a video's title, tags, and description match the words in a search query. The amount of time people spend watching a video is one way YouTube judges whether it is actually useful for a given search intent. Quality is assessed through signals of how knowledgeable, authoritative, and trustworthy a channel is. Interestingly, when YouTube returns videos that match a query, the algorithm also weighs what each individual viewer likes, so for users the platform adapts to their own wants and needs.
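To make this idea concrete, here is a minimal, purely illustrative sketch in Python of how signals like relevance, watch time, and channel quality could be combined into a single ranking score. The field names, weights, and formula are my own assumptions for illustration only; YouTube's actual ranking model is far more complex and is not public.

```python
# Toy illustration (not YouTube's real formula) of combining the three
# ranking signals described above: relevance, interest, and quality.
def toy_ranking_score(video, query_terms):
    # Relevance: fraction of query terms found in the title, tags, and description.
    text = " ".join([video["title"], video["description"]] + video["tags"]).lower()
    relevance = sum(term.lower() in text for term in query_terms) / len(query_terms)

    # Interest/engagement: average watch time relative to video length (capped at 1.0).
    engagement = min(video["avg_watch_seconds"] / video["duration_seconds"], 1.0)

    # Quality: a placeholder channel-authority score in [0, 1] (assumed input).
    quality = video["channel_authority"]

    # Weighted combination; these weights are arbitrary for this sketch.
    return 0.4 * relevance + 0.4 * engagement + 0.2 * quality


example_video = {
    "title": "How neural networks learn",
    "description": "An introduction to backpropagation.",
    "tags": ["machine learning", "neural networks"],
    "avg_watch_seconds": 420,
    "duration_seconds": 600,
    "channel_authority": 0.8,
}
print(toy_ranking_score(example_video, ["neural", "networks"]))  # -> 0.84
```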
Turning to problematic content: seven studies have examined how YouTube's recommender system helps spread extremist content. Six of them concluded that the recommender system promotes extremist material, and five found a link between recommender systems and filter bubbles. One study reports that the recommender system pairs extremist content with counter-messaging from government or public-interest groups, yet that same counter-message content can lead viewers back to extremist content within just two clicks. These findings show how complicated counter-messaging is and how it can still expose people to extremist ideas.
Two studies examined anti-vaccination videos on YouTube and reached different conclusions. One found that the recommendation algorithm amplified anti-vaccination content for users who had already watched it. A second found that YouTube now suggests more pro-vaccination videos than anti-vaccination ones, which may reflect the platform's recent efforts to fight misinformation and remove anti-vaccine videos. The COVID-19 pandemic, which raised awareness of how important accurate health information is, may have accelerated those efforts. Together, these studies underline that platforms should take responsibility for the content they host and its effect on public health.
Download behavior can also be examined through the empirical CDF of the download rate for four different video formats. In every case, the first request showed a fast download rate that dropped gradually with each subsequent request, reaching a steady level after roughly the fourth request. This indicates that YouTube uses a fast-start strategy: it fills a large initial buffer very quickly, and once that buffer is full, the rate at which video data is delivered to the browser is throttled down toward the minimum rate needed for the video to play smoothly in that format (How the YouTube Algorithm Works, n.d.; Analysis of YouTube User Experience from Passive Measurements, n.d.; Yesilada & Lewandowsky, 2022).
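As a small illustration of the measurement described above, the sketch below computes an empirical CDF from a set of per-request download rates in Python. The rate values are made up to mimic the fast-start pattern (high initial rates, then throttling toward a steady playback rate); they are not actual measurements from the cited paper.

```python
import numpy as np

# Hypothetical per-request download rates (Mbit/s) for one video format.
# The first few requests are served at a high "fast start" rate; later
# requests are throttled toward the steady playback rate. Illustrative only.
rates = np.array([18.0, 14.5, 9.0, 5.2, 4.1, 4.0, 4.0, 3.9, 4.1, 4.0])

def empirical_cdf(samples):
    """Return sorted sample values and the empirical CDF F(x) = P(X <= x)."""
    x = np.sort(samples)
    f = np.arange(1, len(x) + 1) / len(x)
    return x, f

x, f = empirical_cdf(rates)
for value, prob in zip(x, f):
    print(f"rate <= {value:5.1f} Mbit/s with empirical probability {prob:.2f}")
```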
References:
How the YouTube algorithm works: What marketers need to know. (n.d.). Retrieved December 6, 2023, from https://searchengineland.com/how-youtube-algorithm-works-393204
Analysis of YouTube user experience from passive measurements. (n.d.). Retrieved December 6, 2023, from https://www.researchgate.net/publication/271425641_Analysis_of_YouTube_user_experience_from_passive_measurements
Yesilada, M., & Lewandowsky, S. (2022). Systematic review: YouTube recommendations and problematic content. Internet Policy Review, 11(1). https://policyreview.info/articles/analysis/systematic-review-youtube-recommendations-and-problematic-content
Response 1:
Great post this week as we discuss what our information system processes are and how they compete in the marketplace. It is interesting to see all the different companies being discussed and how they compare to one another. Having an algorithm that takes user data and watched videos and converts them into suggested videos for the user is impressive. Spotify (my information system) takes a similar approach, using what people listen to or watch to compute suggestions for the user. Thanks for the post and good luck!
Response 2:
I agree with you and Dr. Larson, but I will say that interesting things happen to some of my friends. Just the other day, a friend of mine who is a female bodybuilder was talking about the algorithm confusing her interest in bodybuilding with other content where people wear little more than a bikini. I brought it up because something similar happened on my app as well: I liked and commented on some of her photos from her last competition, and the algorithm then started making recommendations that were off the mark, though I could see why it made them. I believe it is a valuable tool, but as I said earlier in the class, it is an estimate of what you want, not a sure thing.
Response 3:
It's fascinating how YouTube's algorithm works, right? The way it tailors video suggestions based on individual user
preferences, using factors like relevance, interest, and quality, is pretty smart. It's like having a personal video assistant that knows exactly what you like. But then, there's this whole other side to it – the way YouTube's recommendation system can sometimes amplify certain types of content, like extremist views or anti-vaccination sentiments. It's a bit of a double-edged sword. The recent shifts towards promoting more authoritative health information, especially during the pandemic, show how these platforms are evolving and taking responsibility.
Considering the impact of these algorithms, how do you think users can better navigate YouTube to avoid misinformation and find quality content? Do you think there should be more transparent guidelines on how these recommendation systems work?