John Berger’s seminal work, *Ways of Seeing*, published in 1972, revolutionized art criticism and visual culture theory. Berger deconstructed the hidden ideologies embedded in Western art, exposing how paintings, particularly those depicting women, reflected and reinforced patriarchal power structures. Half a century later, we face a new visual landscape, dominated not by oil paintings in museums but by the algorithms that shape our digital experiences. This essay argues that Berger’s insights remain powerfully relevant in the digital age: algorithms function as a new form of 'seeing,' subtly yet profoundly influencing our perceptions, beliefs, and behaviors.
Berger’s Original Thesis: Deconstructing the Gaze
*Ways of Seeing* argued that our understanding of art is not innate but socially constructed. Berger challenged the notion of objective appreciation, revealing how historical context, ideological biases, and power dynamics shape our interpretations. He famously demonstrated how European oil paintings often presented women as objects of male desire, reflecting and perpetuating a patriarchal gaze. He analyzed advertisements, showing how they employed similar techniques to equate consumer goods with idealized versions of beauty and happiness. Berger’s key insight was that "men act and women appear," highlighting the asymmetrical relationship embedded in visual representation. The male gaze, according to Berger, transforms women into objects of scrutiny and judgment, their worth determined by their perceived attractiveness to men.
“Men act and women appear. Men look at women. Women watch themselves being looked at. This determines not only most relations between men and women but also the relation of women to themselves. The surveyor of woman in herself is male: the surveyed female. Thus she turns herself into an object - and most particularly an object of vision: a sight.” - John Berger, *Ways of Seeing*
Central to Berger's analysis was the concept of *mystification*. He argued that traditional art criticism often cloaked art in an aura of exclusivity and genius, obscuring the underlying social and economic forces that shaped its creation and consumption. By demystifying art, Berger aimed to empower viewers, encouraging them to question the dominant narratives and develop their own critical perspectives.
Algorithms as Ways of Seeing 2.0: The New Gaze
The digital age presents a new set of visual challenges. Algorithms, the complex sets of instructions that govern our online experiences, have become the primary gatekeepers of information, entertainment, and even social connection. These algorithms, often opaque and difficult to understand, filter and curate the content we see, subtly shaping our perceptions of the world.
Consider the algorithms that power social media platforms like Facebook, Instagram, and Twitter. These algorithms analyze vast amounts of data about our online behavior – our likes, shares, comments, and searches – to personalize our feeds. While this personalization can be convenient, it also creates *filter bubbles* and *echo chambers*, limiting our exposure to diverse perspectives and reinforcing existing biases. The algorithms prioritize content that is likely to engage us, often amplifying sensationalist, emotionally charged, or polarizing viewpoints. This can lead to increased social division and a distorted understanding of complex issues.
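The narrowing effect described above can be made concrete with a toy model. The sketch below, using entirely hypothetical post data and a made-up affinity score, shows how ranking content by predicted engagement lets a user's past behavior crowd everything else out of the feed; real platform ranking systems are vastly more complex, but the feedback loop is the same in principle.

```python
# Minimal sketch of engagement-weighted feed ranking (hypothetical data).
# Content similar to what a user already engages with scores higher,
# so it dominates the feed: the mechanism behind "filter bubbles."
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    engagement: float  # platform-wide engagement signal (likes, shares)

def rank_feed(posts, user_topic_affinity, top_k=3):
    """Order posts by predicted engagement for this particular user."""
    def score(post):
        # Personal affinity amplifies the global engagement signal;
        # unfamiliar topics get a small default weight.
        return post.engagement * user_topic_affinity.get(post.topic, 0.1)
    return sorted(posts, key=score, reverse=True)[:top_k]

posts = [
    Post("politics", 0.9),
    Post("science", 0.6),
    Post("art", 0.5),
    Post("politics", 0.8),
]
# A user who mostly clicks on political content...
affinity = {"politics": 1.0, "science": 0.2}
feed = rank_feed(posts, affinity)
print([p.topic for p in feed])  # political posts crowd out the rest
```

Nothing in the scoring function is malicious; the bubble emerges simply from optimizing each individual ranking for engagement.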
The algorithmic gaze extends beyond social media. Search engine algorithms determine the information we find online, influencing our understanding of everything from scientific research to political events. Recommendation algorithms shape our consumption habits, suggesting products, movies, and music based on our past purchases and preferences. Even dating apps rely on algorithms to match us with potential partners, shaping our romantic prospects.
This pervasive algorithmic influence raises critical questions: Who controls these algorithms? What biases are embedded within them? And how can we maintain our autonomy in a world increasingly shaped by automated decision-making?
The Algorithmic Gaze and Gender: Reinforcing Stereotypes?
Berger’s analysis of the male gaze remains relevant in the digital age, albeit in a transformed context. Algorithms can perpetuate and amplify gender stereotypes, often reflecting the biases of the programmers who create them and the data sets they are trained on. For example, facial recognition algorithms have been shown to be less accurate at identifying women and people of color, highlighting the potential for algorithmic bias to perpetuate existing inequalities.
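Disparities of the kind found in facial recognition systems are typically surfaced by measuring a model's accuracy separately for each demographic group. A minimal sketch of that kind of audit, using synthetic labels and predictions rather than any real system's output, might look like this:

```python
# Sketch: auditing a classifier's accuracy gap across demographic groups.
# Labels, predictions, and group tags here are synthetic; a real audit
# would use a held-out test set with reliable demographic annotations.

def accuracy_by_group(y_true, y_pred, groups):
    """Per-group accuracy for parallel lists of labels/predictions."""
    totals, correct = {}, {}
    for t, p, g in zip(y_true, y_pred, groups):
        totals[g] = totals.get(g, 0) + 1
        correct[g] = correct.get(g, 0) + (t == p)
    return {g: correct[g] / totals[g] for g in totals}

y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

acc = accuracy_by_group(y_true, y_pred, groups)
gap = max(acc.values()) - min(acc.values())
print(acc, "gap:", gap)  # group "b" fares worse than group "a"
```

An aggregate accuracy number would hide exactly the disparity this per-group breakdown exposes, which is why disaggregated evaluation has become a standard demand in algorithmic accountability work.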
Furthermore, online advertising algorithms often target women with ads for beauty products, weight loss programs, and domestic goods, reinforcing traditional gender roles. Conversely, men are frequently targeted with ads for cars, technology, and financial services. This algorithmic targeting can perpetuate harmful stereotypes and limit individuals' opportunities and aspirations.
The algorithmic gaze, like the male gaze Berger described, can also contribute to the objectification of women. Social media platforms often prioritize images that conform to narrow beauty standards, encouraging users to self-objectify and seek validation based on their appearance. This can have a detrimental impact on women's self-esteem and mental health.
Resisting the Algorithmic Gaze: Critical Awareness and Agency
While algorithms may seem like an invisible and unstoppable force, it is crucial to remember that they are not neutral or objective. They are created by humans and reflect human biases and values. Therefore, it is essential to develop critical awareness of how algorithms shape our perceptions and behaviors and to find ways to resist their potentially harmful effects.
One important step is to become more informed about how algorithms work. Understanding the basic principles of algorithmic decision-making can help us to identify potential biases and limitations. We can also support efforts to promote algorithmic transparency and accountability, demanding that developers and platforms be more open about how their algorithms function and the data they use.
Another strategy is to actively curate our own online experiences. We can choose to follow diverse sources of information, challenge our own biases, and engage in thoughtful discussions with people who hold different perspectives. We can also use privacy tools and settings to limit the amount of data that algorithms collect about us.
Furthermore, we can support the development of alternative algorithms that are designed to promote fairness, equity, and diversity. This includes investing in research and development of algorithms that are trained on diverse data sets, designed to be transparent and accountable, and subject to independent audits.
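One criterion such independent audits commonly check is *demographic parity*: whether positive outcomes (an ad shown, a loan approved) occur at similar rates across groups. The sketch below uses hypothetical decisions and an illustrative threshold, not any standard regulatory value:

```python
# Sketch of a demographic-parity check (hypothetical data and threshold).

def selection_rates(decisions, groups):
    """Rate of positive decisions (1s) within each group."""
    totals, selected = {}, {}
    for d, g in zip(decisions, groups):
        totals[g] = totals.get(g, 0) + 1
        selected[g] = selected.get(g, 0) + d
    return {g: selected[g] / totals[g] for g in totals}

def parity_gap(decisions, groups):
    rates = selection_rates(decisions, groups)
    return max(rates.values()) - min(rates.values())

decisions = [1, 1, 1, 0, 1, 0, 0, 0]  # 1 = shown the opportunity
groups    = ["m", "m", "m", "m", "f", "f", "f", "f"]

gap = parity_gap(decisions, groups)   # m: 0.75 vs. f: 0.25
if gap > 0.2:  # audit threshold, purely illustrative
    print("parity violation flagged")
```

Parity metrics like this are blunt instruments and only one of several competing fairness definitions, but they make a previously invisible pattern auditable.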
Ultimately, resisting the algorithmic gaze requires a collective effort. We need to engage in critical conversations about the social and ethical implications of algorithms and work together to create a digital landscape that is more inclusive, equitable, and empowering.
Conclusion: Seeing Beyond the Algorithm
John Berger’s *Ways of Seeing* provided a powerful framework for understanding how visual representations shape our perceptions of the world. In the digital age, algorithms have become the new arbiters of visual culture, subtly yet profoundly influencing our thoughts, beliefs, and behaviors. By developing critical awareness of the algorithmic gaze, we can resist its potentially harmful effects and reclaim our agency in shaping our own visual experiences. We must strive to see beyond the algorithm, to question the dominant narratives, and to cultivate our own independent and critical perspectives. The future of visual culture depends on our ability to navigate the digital landscape with discernment, empathy, and a commitment to creating a more just and equitable world. Only then can we truly *see* for ourselves, free from the constraints of algorithmic control.