Agency: Mary Margaret Clouse


Agency reminds me a lot of our readings and discussions surrounding data: an agent's agency is largely determined by what the algorithm knows about the user, as provided by data. Noessel mentioned how agents would ensure that “your interests find you,” which makes me question how much of one’s interests would be self-determined. How much are they now? I’m sure that agents built into apps like Spotify have suggested music to me, which has likely led to my discovery of some of my favorite artists, but what happens when these agents reinforce danger or violence online? The Noessel article casts the influence of agents in a very positive light, focusing on how they will make life easier. However, I cannot help but fear the loss of autonomy that may come with self-driving cars, algorithms catered to one’s interests, and so on. I feel that this may narrow the scope of our skills and knowledge. Will there one day be teenagers who don’t have to learn how to drive? As convenient as it sounds, this increased focus on making things easier can become quite dangerous.

3 thoughts on “Agency: Mary Margaret Clouse”

  1. Thomas Takele

    I had thought about how invasive agents could become, but I had never thought about it to the degree you are suggesting. I think it is a great idea to consider the long-term effects of agents and how they could limit us in the future with the amount of “convenience” they provide. You gave a great example with self-driving cars, because they put people’s lives at risk and could lead to deaths if they are not evaluated properly. Another example of an agent that could be dangerous is the automated home. Although it is very convenient when people can say “Hey Alexa” and get everything done within their home, it decreases the movement needed around the house, which can lead to higher rates of obesity.

  2. Grace Brogan

    Your description of the influence of technologically shaped interests on the self is thought-provoking. I especially agree with your last statement on the theoretical danger of convenience. Although having platforms suggest things they think we will like is convenient and helpful in theory, it does seem to take away a certain level of human autonomy. This kind of algorithmic suggestion can have real-world consequences; for example, there is evidence that platforms such as TikTok or Facebook may start suggesting increasingly wild conspiracy theories to people who are especially prone to believing them.

  3. Benjamin Cudmore

    I think it is fascinating to consider that some of our favorite songs, movies, or podcasts could currently be the result of our agents picking them out for us. As a little kid, I was not surrounded by media that generated results through technological agents. However, a comparison I can draw from your example is that our parents act similarly to agents. My parents knew my interests growing up and would choose things for me to read, watch, or listen to, mirroring how search results adapt to match a user’s interests. Additionally, I find that the question you propose at the end of your post ventures away from the concept of agency. While you can select a destination for your car to drive to, I think it is a stretch to say the choices the car makes to get you there are based on what the algorithm thinks would match your personal interests.
