🌐 Week Eight: Digital Citizenship and Software Literacy - Instagram Filters 📸✨
In today’s post, we’re diving into the world of Instagram filters and their role in shaping digital citizenship and software literacy. Filters are more than just fun effects - they influence how we see ourselves and others, sometimes altering perceptions in profound ways. With their growing popularity, understanding the implications of filters can help us navigate social media more responsibly.
[[MORE]]
🌈 The Allure and Impact of Instagram Filters
Instagram filters are a staple of visual social media, allowing users to modify their photos with the tap of a screen. While these filters can be creative and entertaining, they also present a version of reality that isn’t entirely authentic. Filters act as cultural tools that shape our perceptions, subtly reinforcing beauty ideals by smoothing skin, brightening eyes, or changing skin tones (Rettberg 2014). Filters, in this way, aren’t just decorative - they influence what we see as desirable and even “normal.”
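The "smoothing skin, brightening eyes" effects mentioned above are, at bottom, simple arithmetic on pixel values. As a purely illustrative sketch (not how Instagram actually implements its filters, which operate on full-colour video frames with far more sophisticated processing), here is what brightening and smoothing look like on a tiny grayscale grid:

```python
# Toy sketch of what a "beautify" filter does numerically. Values are
# 0-255 grayscale intensities; the `face` grid is a made-up example.

def brighten(pixels, amount):
    """Raise every intensity by `amount`, clamped to the 0-255 maximum."""
    return [[min(255, p + amount) for p in row] for row in pixels]

def smooth(pixels):
    """3x3 box blur: each pixel becomes the average of its neighbourhood
    (edges use only in-bounds neighbours) - a crude 'skin smoothing'."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += pixels[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

face = [[100, 120, 100],
        [120, 200, 120],   # the bright centre pixel is a "blemish"
        [100, 120, 100]]
print(smooth(brighten(face, 20)))
```

Even in this toy version, the "blemish" is averaged away and the whole image is lifted toward an artificial evenness - a small taste of why filtered images read as polished but inauthentic.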
As users, we should be aware of how filters affect our self-image and interactions with others. Filters can reinforce narrow beauty standards that leave little room for diversity. For example, beauty filters often alter facial features to align with Eurocentric ideals, which can be damaging to self-esteem and cultural identity (Barker 2020). This isn’t just a minor issue; it’s a form of digital literacy we must cultivate to critically engage with the content we create and consume.
You can easily find a whole series of “HOW TO LOOK LIKE AN INSTAGRAM FILTER IN REAL LIFE” tutorials on Google.
👁️ Biometric Influence: Filters and Machine Vision
Instagram filters don’t just modify images; they also teach us how to interact with machine vision. Filters work by using augmented reality (AR) to superimpose changes onto our faces in real time, effectively turning our selfies into data for machine learning (Rettberg 2017). This process can affect our self-perception, as we adapt our images to match the “ideal” encoded within the filters. According to Rettberg (2017), our engagement with these filters is part of a broader trend toward biometric citizenship, where our faces become data points in algorithmic systems.
The implications of this are significant. As we grow accustomed to using filters, we might be subtly shifting our ideas of beauty and self-worth to align with machine-driven standards. It’s essential to develop software literacy so we can understand the influence these tools have on us and use them more mindfully.
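To make the idea of "faces as data points" concrete, here is a minimal sketch of landmark-driven editing. The coordinates and function below are hypothetical illustrations, not Instagram's actual pipeline: real AR filters obtain landmarks from trained face-detection models running on live video, but the principle - locate facial features as data, then edit pixels near them - is the same.

```python
import math

# Hypothetical sketch: a filter edits only the pixels near detected
# facial landmarks, e.g. a "brighter eyes" effect.

def brighten_near_landmarks(pixels, landmarks, radius, amount):
    """Brighten pixels within `radius` of any (x, y) landmark,
    clamped to 255; everything else is left untouched."""
    out = [row[:] for row in pixels]
    for y, row in enumerate(pixels):
        for x, value in enumerate(row):
            if any(math.dist((x, y), lm) <= radius for lm in landmarks):
                out[y][x] = min(255, value + amount)
    return out

frame = [[100] * 8 for _ in range(8)]   # flat grey 8x8 stand-in "selfie"
eyes = [(2, 3), (5, 3)]                 # made-up eye-landmark coordinates
edited = brighten_near_landmarks(frame, eyes, radius=1, amount=60)
print(edited[3][2], edited[0][0])       # brightened near an eye, unchanged elsewhere
```

The point of the sketch is that the "face" the software sees is just a list of coordinates - exactly the kind of biometric data Rettberg (2017) argues we are training ourselves to produce.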
🔍 Navigating the Ethics of Filter Use
Understanding software literacy means recognizing both the benefits and drawbacks of filters. While filters can be fun and allow for creative expression, they can also lead to digital deskilling, as we rely on pre-made effects rather than creating original content. Filters encourage us to present polished versions of ourselves, which might not always reflect reality. By questioning our use of filters, we engage in digital citizenship, making informed choices about how we want to be perceived online (Barker 2020).
In light of this, digital citizens can make a conscious effort to either reject or use filters in ways that support genuine self-expression and inclusivity. It’s also helpful to encourage conversations about the impact of filters on body image and self-esteem. As Rettberg (2014) reminds us, filters are both cultural and technological tools that shape our experiences - so let’s make sure we’re shaping them with intention.
References
Barker, J 2020, ‘Making-up on mobile: The pretty filters and ugly implications of Snapchat’, Fashion, Style & Popular Culture, vol. 7, nos. 2-3, pp. 207-221, DOI: 10.1386/fspc_00015_1.
Rettberg, JW 2014, Seeing Ourselves Through Technology: How We Use Selfies, Blogs and Wearable Devices to See and Shape Ourselves, Palgrave Macmillan, Basingstoke, DOI: 10.1057/9781137476661.0004.
Rettberg, JW 2017, ‘Biometric Citizens: Adapting Our Selfies to Machine Vision’, in Selfie Citizenship, ed. Kuntsman, A., Springer, Cham, DOI: 10.1007/978-3-319-45270-8_10.