
Title M2 video test practice
Course Science, Technology, and Society
Institution University of North Carolina at Charlotte



Description

Artificial Intelligence and Algorithms

Instructions: Answer the questions below using complete sentences. When you're finished, turn in your completed response on Canvas.

1. According to Seaver, what is Conway's Law?
According to Seaver, Conway's Law is the idea that software systems take the shape of the organizations that make them (Seaver 2018).

2. According to Seaver, what makes algorithms powerful in society?
Algorithms are powerful in society because they can identify potential criminals, decide loan eligibility, and shape the news feed. Seaver also states that algorithms leave their critics and readers to fight on behalf of humans, and that they threaten society and culture by reformatting our lives.

3. In Seaver's article, why did Jacky Alciné believe that the algorithm mislabeled him and his friend as gorillas?
Jacky Alciné believed the algorithm mislabeled him and his friend because of the lack of diversity in the algorithm and on the team that produced it.

4. According to Seaver, what problems could arise if we "grant algorithms autonomy"?
If we grant algorithms autonomy, we will no longer be able to make sense of any story; we will never know the reason behind a given algorithm's decision.

5. What is the Impact Pro algorithm designed to do? (You can find this information in the article by Obermeyer et al. or in the related Business Insider article)
The Impact Pro algorithm is designed to identify which patients would benefit from complex health procedures. In practice, it favored white patients over sicker Black patients.

6. According to Obermeyer et al., what information goes into the Impact Pro data set?
Demographics, chronic illnesses, biomarker values, care utilization, and care management program enrollment are all part of the Impact Pro data set.

7. According to Obermeyer et al., is data on race collected by Impact Pro?
Yes. Data on race is collected by the hospitals via patient forms; if patients identify as Black, they are recorded in that category.

8. According to the articles, how does Impact Pro cause white and non-white patients to be treated differently?
Impact Pro predicted that Black patients would cost less, which medical providers take to mean their illnesses must not be as serious. This is because of the lack of healthcare access in Black communities and the abundance of healthcare access in white communities.

9. According to Nguyen, what is a consequence of the LAPD's use of PredPol, a predictive policing software?
One consequence of the LAPD's use of PredPol is its disproportionate impact on Black and brown communities.

10. As we can see from these examples, artificial intelligence is subject to the bias of its human users and creators. What do you think could be done to reduce bias in the world of artificial intelligence and algorithms?
I looked up a few ways to help with the bias involved in algorithms, and I read an article from the Harvard Business Review written by Brian Uzzi that suggested "blind taste tests." He compared the approach to the blind taste tests for Pepsi and Coca-Cola, where there were no labels. Incorporating this into an algorithm, so that information that may bias the outcome is omitted, could be a start toward resolving biased algorithms. The article is linked below.
https://hbr.org/2020/11/a-simple-tactic-that-could-help-reduce-bias-in-ai...
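The "blind taste test" idea in question 10 can be sketched in code: before any record reaches a scoring model, strip out the fields that could bias the outcome. Everything below is a hypothetical illustration, not Uzzi's or any vendor's actual implementation; the field names and the toy scoring rule are assumptions.

```python
# A minimal sketch of the "blind taste test" idea: remove fields that could
# bias a model's output before the record is ever scored.
# All field names and the scoring rule here are hypothetical examples.

SENSITIVE_FIELDS = {"race", "zip_code", "name"}  # assumed proxies for bias

def blind(record: dict) -> dict:
    """Return a copy of the record with potentially biasing fields removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

def score(record: dict) -> float:
    """Toy eligibility score computed only from the fields it receives."""
    return record.get("income", 0) / 1000 + record.get("years_employed", 0)

applicant = {
    "name": "A. Smith",
    "race": "(collected, but withheld from the model)",
    "zip_code": "28223",
    "income": 52000,
    "years_employed": 4,
}

blinded = blind(applicant)
print(sorted(blinded))  # only the non-sensitive fields survive blinding
print(score(blinded))
```

The key design point is that the model never sees the sensitive fields at all, so they cannot influence its output; note, though, that remaining fields (like a ZIP code left in by mistake) can still act as proxies, which is why the articles argue blinding is only a start.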

