BLONDE BRAIDS STUDY I
BLONDE BRAIDS STUDY II
BLONDE BRAIDS STUDY III
BLONDE BRAIDS STUDY IV
BLONDE BRAIDS STUDY V
BLONDE BRAIDS STUDY VI
BLONDE BRAIDS STUDY VII
BLONDE BRAIDS STUDY VIII
BLONDE BRAIDS STUDY IX
BLONDE BRAIDS STUDY X

INTERVIEW EXCERPT
MAYA ANGELOU ONE-ON-ONE, 1983

Maya Angelou:
There was a woman in my town. Mrs Flowers. She was Black. She was so beautiful. She was that color Black where there was red under the Black, so that it looks purple. You know. So that if the sun touches it in a certain way, it is so gorgeous.
Interviewer:
Do Black folks call that a certain color?
Maya Angelou:
Oh yeah. It could be blue-black, or plum-black.
In describing a person's color, we call people cinnamon, toast, pomegranate. Somebody may be called ginger, honey colored.
Interviewer:
Really?
Maya Angelou:
Oh yeah.
Interviewer:
What do they call you?
Maya Angelou:
Well, I’ll be honey-colored. Or, they call me brown skin.

VISUAL SUPPLEMENT FOR BLONDE BRAIDS STUDY (2022) BY MINNE ATAIRU

This gallery is a visual supplement for my AI-generated portrait series titled Blonde Braids Study (2022). The series began with a question:

Can the text-to-image algorithm — Midjourney (V4) — accurately generate studio portraits of "blue-black" or "plum-black" complexioned twins sporting blonde braids?

Contrary to the prompt, the algorithm generated portraits of fraternal twins with rolled, permed, and silky hairstyles. Their complexions recurred in two variations: [ 1 ] a juxtaposition of caramel and chocolate hues, and [ 2 ] matching caramel complexions.

Given these outputs, one might ask: Why does Midjourney (V4), despite its advanced capabilities, diverge from the given description?

Bender et al. (2021) offer critical insight: generative systems (such as Midjourney) are designed to synthesize images based upon patterns discerned from large-scale, uncurated datasets. Amassed through indiscriminate web scraping, such datasets are shaped by the dynamics of internet participation and therefore represent the viewpoints of those with consistent internet access. "In the case of US and UK, this means that white supremacist and misogynistic, ageist, etc. views are overrepresented in the training data, not only exceeding their prevalence in the general population but also setting up models trained on these datasets to further amplify biases and harms." This insight prompts further interrogation: Do the above-described portraits mirror biased perspectives about Black hairstyles? To what extent are such perspectives symptomatic of natural hair discrimination, a form of bias marked by the hyperregulation and hypersurveillance of Black hair textures?

From the era of chattel enslavement to the present day, systems of oppression have shaped societal attitudes towards natural Black hairstyles such as afros, braids, twists, and locks. Black hair is often perceived as 'bad' hair, 'woolly' hair, 'bushy' hair, 'unprofessional' hair, 'dreadful' hair, as indicative of the "angry Black woman" stereotype, and as a marker of criminality (MacLin & Herrera, 2006). Consequently, Black folks feel pressured (Dawson et al., 2019) to conform to Eurocentric hairstyles by straightening/smoothing their naturally curly/coily hair. Yet, when they choose blonde-colored hairstyles — whether achieved through protective styling, chemical/thermal treatments, or synthetic extensions — this choice is deemed [ 1 ] "unnatural" for their racial identity, and [ 2 ] a violation of "hair privileges" traditionally reserved for Whiteness (Greene, 2010).

Such biases are evident in arbitrary workplace grooming policies. For example, in 1999, Andrea Santee, a "Black woman with dyed blonde hair," faced employment discrimination at the Windsor Court Hotel in New Orleans; her blonde hair was categorized as "an extreme hair color" under the hotel's grooming policy. In 2003, Shirley Bryant, a CUNY (City University of New York) employee with short, curly blonde hair, experienced workplace harassment for deviating from "Afrocentric" hairstyles; Bryant's supervisor allegedly called her a "wannabe" (meaning: "want to be White"). In 2009, Dulazia Burchette, an employee at Abercrombie and Fitch, was pressured to remove blonde highlights or resign (Greene, 2010); Burchette's supervisor allegedly said "she should have the hair color she was born with." In 2017, Destiny Tompkins, a Banana Republic employee, was subjected to workplace discrimination for sporting "urban" and "unkempt" blonde box braids. In 2019, Marion Scott, a student at Paragon Charter Academy, was excluded from school photos due to red-colored braids (Abrahamson, 2019).

Building upon these observations, I delved into a secondary inquiry:

Does Midjourney's training data (LAION-5B) include images of melanin-rich folks wearing blonde braided hairstyles?

To explore this, I conducted an NSFW-filtered image search on LAION's database for two queries—'Blonde Box Braids' and 'Blonde Braids' (dated: November 11, 2023). The results overrepresented images of caramel-complexioned Black women and White women. "Blue-black", "plum-black", "chocolate", "dark-skinned", and "melanin-rich" women were underrepresented in the homogenized dataset. The search results — a collection of over 10,000 images — are viewable on this site.
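
For readers who wish to approximate this search, the (now-defunct) LAION portal was backed by the open-source clip-retrieval client. The sketch below shows how a comparable text query against a LAION-5B index could be issued in Python; the endpoint URL, index name, and result fields follow the library's public examples and should be treated as assumptions rather than a record of the exact setup used for this project.

    # Sketch: querying a LAION-5B CLIP index with the clip-retrieval client.
    # Assumes `pip install clip-retrieval` and a reachable knn-service endpoint;
    # the public endpoint referenced below is no longer online.
    from clip_retrieval.clip_client import ClipClient, Modality

    client = ClipClient(
        url="https://knn.laion.ai/knn-service",  # former public endpoint (assumption)
        indice_name="laion5B-L-14",              # ViT-L/14 index name (assumption)
        modality=Modality.IMAGE,
        num_images=1000,                         # ask for up to 1,000 matches per query
        use_safety_model=True,                   # filter NSFW-flagged entries
    )

    for query in ("blonde braids", "blonde box braids"):
        results = client.query(text=query)
        for r in results[:5]:
            # Each hit carries the source image URL, its caption, and a CLIP
            # similarity score -- a conceptual match, not a keyword match.
            print(query, "|", round(r["similarity"], 3), "|", r["caption"], "|", r["url"])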

TECHNICAL NOTES

LAION-5B is a dataset of over 5 billion images and their corresponding captions (Schuhmann et al., 2022). The dataset is searchable through a (now-defunct) portal that leverages CLIP (Contrastive Language-Image Pre-training) for image retrieval. Unlike conventional search engines, which rely on exact keyword matches, CLIP assesses images based on their conceptual relevance to a text query; retrieval reflects CLIP's interpretation of the terms rather than literal matches. As such, the image results for the search queries 'blonde braids' and 'blonde box braids' include captions that do not explicitly mention these terms.
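
To illustrate how conceptual retrieval differs from keyword matching, the sketch below embeds a text query and a handful of images into CLIP's shared space and ranks the images by similarity. It uses the openly available openai/clip-vit-base-patch32 checkpoint via the transformers library as a stand-in; the image filenames are hypothetical, and LAION-5B's own index was built with larger CLIP variants.

    # Sketch: ranking images by conceptual relevance to a text query with CLIP,
    # rather than by exact keyword matches in their captions.
    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    query = "blonde box braids"
    image_paths = ["portrait_01.jpg", "portrait_02.jpg", "portrait_03.jpg"]  # hypothetical files
    images = [Image.open(path) for path in image_paths]

    inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
    with torch.no_grad():
        outputs = model(**inputs)

    # logits_per_image holds the scaled cosine similarity between each image
    # embedding and the text embedding; a higher score means "more relevant".
    scores = outputs.logits_per_image.squeeze(1)
    ranked = sorted(zip(image_paths, scores.tolist()), key=lambda pair: -pair[1])
    for path, score in ranked:
        print(f"{score:7.2f}  {path}")

Because ranking happens in the embedding space, an image whose caption never mentions 'braids' can still score highly for the query, which is why the gallery's results include captions that omit the search terms.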

None of the images displayed in this gallery are hosted on this site; they are retrieved from external image URLs provided by the LAION-5B dataset. All inactive image links are automatically identified and replaced with lime-green placeholders.
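
The following is a minimal sketch of how such dead-link handling could work behind the scenes: each LAION-provided URL is probed, and unreachable ones are swapped for a lime-green placeholder. The function name and placeholder path are hypothetical; this is not the gallery's actual code.

    # Sketch: flag LAION-provided image URLs that no longer resolve so the gallery
    # can swap in a lime-green placeholder. Illustrative only; not the site's code.
    import requests

    PLACEHOLDER = "placeholder_lime_green.png"  # hypothetical asset path

    def resolve_image(url: str, timeout: float = 5.0) -> str:
        """Return the original URL if it still serves an image, else the placeholder."""
        try:
            resp = requests.head(url, timeout=timeout, allow_redirects=True)
            if resp.status_code == 200 and resp.headers.get("Content-Type", "").startswith("image/"):
                return url
        except requests.RequestException:
            pass  # timeouts, DNS failures, TLS errors, etc. all count as inactive
        return PLACEHOLDER

    # Example: check two hypothetical URLs returned by the LAION-5B search.
    for url in ("https://example.com/braids_001.jpg", "https://example.com/gone.jpg"):
        print(url, "->", resolve_image(url))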

REFERENCES
  1. Abrahamson, R. A. (2019, October 8). See what happened after a girl was denied a class picture because of her hair. Today. Retrieved from https://www.today.com/parents/michigan-girl-denied-yearbook-photo-because-her-hair-t164152
  2. Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021, March). On the dangers of stochastic parrots: Can language models be too big? 🦜. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623).
  3. Greene, D. W. (2010). Black women can't have blonde hair in the workplace. Journal of Gender, Race & Justice, 14, 405.
  4. Dawson, G. A., Karl, K. A., & Peluchette, J. V. (2019). Hair matters: Toward understanding natural black hair bias in the workplace. Journal of Leadership & Organizational Studies, 26(3), 389-401.
  5. MacLin, M. K., & Herrera, V. (2006). The criminal stereotype. North American Journal of Psychology, 8(2), 197-208.
  6. Schuhmann, C., Beaumont, R., Vencu, R., Gordon, C., Wightman, R., Cherti, M., ... & Jitsev, J. (2022). LAION-5B: An open large-scale dataset for training next generation image-text models. Advances in Neural Information Processing Systems, 35.