Among the multiple artificial intelligence (AI) and machine learning (ML) initiatives announced by Google last week at its annual I/O developer conference, there were a number of projects to train the internet company’s AI platforms. One such development was the introduction of a 10-shade Monk Skin Tone (MST) Scale—something that could have a broader sociological significance as technology platforms witness deeper intersections with society at large. In Google’s words, the MST Scale will “support inclusive product and research across the industry”.
What is the Monk Skin Tone Scale?
Developed in partnership with Dr Ellis Monk, associate professor of sociology at Harvard University, the Monk Skin Tone (MST) Scale is a tool that Google will primarily incorporate into computer vision, a type of AI that allows computers to see and understand images. Computer vision systems have often been found to work less well for people with darker skin than for those with fairer complexions. Using the MST Scale, Google and the wider tech industry aim to build more representative datasets, so that such AI models can be trained to identify a wider range of skin tones in images.
How will it work?
According to Google, the scale will “make it easier for people of all backgrounds to find more relevant and helpful” search results. For instance, users who search for makeup or beauty tutorials in Google Images will see an option to refine search results further by skin tone. Going ahead, Google will use the MST Scale to better detect and categorize images to give a larger range of results.
The tech giant plans to further expand the use of this schema – the structured set of labels built around the different skin tones – so that creators and online businesses can label their content or products by other attributes as well, such as hair color and hair texture. Google has openly released the scale so that anyone can use it for research and product development.
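To make the labeling idea concrete, here is a minimal sketch of how a catalog might tag products with one of the scale's 10 shades and filter on that label. The product data, field names and helper function are illustrative assumptions for this article, not Google's actual schema; only the 10-shade range comes from the MST Scale itself.

```python
# Hypothetical sketch: tagging products with a Monk Skin Tone (MST) shade.
# The 10-shade range is from the MST Scale; everything else (class names,
# fields, sample catalog) is an assumption made for illustration.

from dataclasses import dataclass

MST_SHADES = range(1, 11)  # shades 1 (lightest) to 10 (deepest)

@dataclass
class Product:
    name: str
    mst_shade: int  # MST shade this product is designed for

    def __post_init__(self):
        # Reject labels outside the scale's 10 shades.
        if self.mst_shade not in MST_SHADES:
            raise ValueError(f"MST shade must be 1-10, got {self.mst_shade}")

def filter_by_shade(products, shade):
    """Return only the products labeled with the requested MST shade."""
    return [p for p in products if p.mst_shade == shade]

catalog = [
    Product("Foundation A", 3),
    Product("Foundation B", 8),
    Product("Concealer C", 8),
]

matches = filter_by_shade(catalog, 8)
print([p.name for p in matches])  # ['Foundation B', 'Concealer C']
```

A search or shopping feature refining results by skin tone would amount to a filter like this one, applied over content that creators have labeled against the shared 10-shade schema.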
Why the MST Scale?
According to Dr Courtney Heldreth, a social psychologist and user experience (UX) researcher at Google’s Responsible AI Human-Centered Technology UX department, “persistent inequities exist globally due to prejudice or discrimination against individuals with darker skin tones, also known as colourism”. AI that does not accurately perceive skin tone can compound these existing inequities, and is itself a form of colourism.
To bridge this gap, a Google research team including Heldreth and Xango Eyeé, a product manager working on Responsible AI, focused on bringing more skin tone equity to AI development. The team last year partnered with Monk, whose research has focused on how factors like skin tone, race and ethnicity affect inequality.
The MST Scale builds on the existing Fitzpatrick scale, developed by American dermatologist Thomas B Fitzpatrick in 1975, which classified human skin into six broad types. The Google team and Monk arrived at a scale of 10 shades – a range pegged as neither too limiting nor too complex – and surveys of thousands of adults in the US found that they felt better represented by the new methodology.
Are there similar developments elsewhere?
Several large corporations stepped up their efforts towards colour-based inclusiveness in light of the 2020 Black Lives Matter protests, which took place in the aftermath of police brutality and the killing of 46-year-old African American George Floyd. In 2020, Johnson & Johnson-owned Band-Aid launched a new range of adhesive bandages in different shades of black and brown to make its products more inclusive for people of color. In India, after backlash and strong public feedback on discrimination against people with darker skin tones, the Advertising Standards Council of India (ASCI) in 2014 released guidelines for the advertising of skin lightening and fairness products. More recently, in 2020, Hindustan Unilever-owned cosmetic brand Fair & Lovely was renamed Glow & Lovely, after criticism that its advertisements and marketing campaigns promoted colorism.