Last week, we attended an event hosted by Landor titled "Tech. Gender. Brand," which shed light on some of the most important issues companies face today around gender in the workplace. Insights were shared by reporter Lydia Dishman, IBM's Nancy Kramer, Squarespace's Head of Diversity Lisa Lee, and Lowekey CEO Greg Lowe II. The overarching theme: today's brands need to start creating a voice that aligns with this generation's values, one of which is moving past gender bias.
Advertising decisions can be biased because of unconscious, preconceived notions about gender roles.
Gender roles have played a key part in the way people think, behave, and interact with one another for as long as anyone can remember. It only seems natural, then, that gender roles and norms would also play a part in how companies advertise products.
Looking at America's history in advertising, we can see how this came about: from tobacco brands like Nebo Cigarettes running ads that objectified women as early as 1912, to more recent examples like Carl's Jr.'s 2015 "All Natural" Super Bowl commercial, which over-sexualized women. These are just a couple of examples of how women are portrayed and objectified through media.
Is Machine Learning Going to Solidify This Bias?
Beyond the ads themselves, the sexualization of women in the tech industry shows up in products like Amazon's Alexa and Microsoft's Cortana. These female personas are literally built to be ordered around in someone's home and programmed to be knowledgeable on every topic; sound like a demographic we know? Or what about travel and hospitality companies like Uber and Airbnb, which constantly struggle with the safety of their female customers? These unconscious gender norms are, more often than not, shrugged aside in the industry. Not acceptable.
Worse, women aren't just being shrugged aside; in some places they're literally disappearing. Machine learning has made it possible for women to lose visibility on social platforms. Ever search for a name like "Stephanie Williams" on LinkedIn and get a response such as "Did you mean Stephan Williams?" LinkedIn's own algorithm wasn't in favor of women: a site meant for professional networking was effectively suggesting that women weren't as worth finding as men. LinkedIn has since fixed this "glitch," but the point still stands.
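LinkedIn never published how its suggestion feature worked, but one plausible mechanism for this kind of bias is simple frequency skew: if a "did you mean" corrector prefers the close match that appears most often in historical search logs, and those logs over-represent male names, it will "correct" female names toward male ones. Here is a minimal, hypothetical Python sketch of that failure mode; the name counts are invented for illustration and have nothing to do with LinkedIn's actual data or code.

```python
import difflib

# Hypothetical search-log frequencies, with the male name
# over-represented relative to the female name.
name_counts = {
    "stephan williams": 1200,
    "stephanie williams": 150,
}

def did_you_mean(query, counts, cutoff=0.8):
    """Suggest the close match with the highest log frequency,
    but only if it is more frequent than the query itself."""
    candidates = difflib.get_close_matches(query, counts, n=5, cutoff=cutoff)
    best = max(candidates, key=lambda name: counts.get(name, 0), default=None)
    if best is not None and counts.get(best, 0) > counts.get(query, 0):
        return best
    return None

# The corrector "fixes" the less frequent female name...
print(did_you_mean("stephanie williams", name_counts))  # stephan williams
# ...but leaves the more frequent male name alone.
print(did_you_mean("stephan williams", name_counts))    # None
```

Nothing in this sketch is explicitly about gender; the bias falls out of optimizing for frequency alone, which is exactly why these problems are so easy to ship without noticing.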
Today's generation is all about new experiences, and as discussed in our previous blog posts, for brands to become iconic they need to be upfront with their consumers and stay honest and relevant. Part of that relevancy is steering away from unconscious gender stereotypes and staying true to their brand.
Landor's goal with this panel discussion was to bring light to the ideas associated with gender, specifically in the tech industry, and to create a dialogue that is too often ignored in the professional world. Technology has uncovered uncomfortable truths about clichéd gender norms, and it is time we took a step forward and moved past these unconscious tendencies. The only way to make these changes is through discussions like this one, and hopefully, with time, tech will become an integrated and diverse part of society.