To wrap up our roundup of the biggest news stories from Google I/O we are bringing you news of a welcome update to Google Photos as well as a very positive move from Google that should help improve representation and inclusivity. Let’s dig in…
Google has been working with Harvard professor and sociologist Dr. Ellis Monk to incorporate the Monk Skin Tone (MST) scale into various Google products, including Google Photos and Google Search, and also to improve the labeling and accuracy of the datasets that will train the AIs behind future Google products and the tech industry at large. The MST is a 10-shade scale that makes it easier to represent a broader range of skin tones in tech products.
This is solving a problem that you might not know about if you don’t experience it yourself, but rest assured it is real, and it is harmful for people to feel like the world doesn’t recognize them. You can see the positive impact this type of inclusion can have in the enthusiastic way people have been responding to the recently launched OurTone Band-Aids, which come in a variety of different skin tones.
To start with, Google Photos will put the MST scale to work with the rollout of new Real Tone filters. These new filters will allow users to choose from a wider assortment of looks and styles that better reflect their own appearance. This new set of features will roll out to Google Photos on Android, iOS, and the web in the next few weeks.
Another interesting way Google is going to use the MST scale is in web image searches. This means that when searching for things like makeup tutorials, users will be able to refine their results to better match their own skin tone, as well as attributes like hair color and texture.
Out of all our updates from Google I/O, this is our favorite. We love to see tech giants working hard to overcome issues that we risk forever embedding into the computer code that will help run the future. All the systemic bias that exists today, conscious or not, will be present in the data that feeds and trains the algorithms behind the programs and tools we interact with on a daily basis. That is why search results so often turn up images of white people, or show women working in home kitchens but men as professional chefs. We need to be proactive to prevent these biases from being forever enshrined in computer algorithms, and it is great to see Google taking the lead on skin tone.
You can learn about the other big updates from Google I/O, including our report on the new Immersive View in Google Maps and how Google Assistant is learning to talk more naturally.