According to a Business Insider report, Google’s AI tool will no longer add gender labels such as “man” or “woman” to images; instead, it will identify people simply as “person”. The reason given is that a person’s gender cannot reliably be determined from their face.
In an email sent to its developers, Google stated that its Cloud Vision API, the tool used to identify items in an image such as faces, landmarks, brand logos and other objects, will no longer apply “man” or “woman” labels to faces recognized in a picture.
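To illustrate what this change means for developers in practice, here is a minimal sketch, assuming the google-cloud-vision Python client library; the file path and the exact labels returned are illustrative assumptions, not taken from the report.

```python
# Minimal sketch: request label detection from the Cloud Vision API and print
# the returned labels. After the change described above, a photo of a person
# would be expected to surface a generic "Person" label rather than a gendered
# one such as "Man" or "Woman".
from google.cloud import vision


def print_image_labels(image_path: str) -> None:
    """Run label detection on a local image and print each label with its score."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for label in response.label_annotations:
        # e.g. "Person: 0.97" instead of a gender-specific label
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    print_image_labels("example.jpg")  # hypothetical local file path
```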
Explaining the change, the email to developers read: “Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”
Bias in AI algorithms has been a much-discussed problem; the most common example is image-recognition algorithms misidentifying people based on skin colour, particularly Black people. In 2015, it was reported that Google Photos had labelled a software engineer’s Black friends as “gorillas”. Google promised to fix the issue, but little had been done by 2018 beyond preventing the algorithm from recognizing gorillas at all.
The AI principle Google cited in the email commits the company to avoiding unjust impacts on people, especially around sensitive attributes such as gender, caste, income, nationality, sex, beliefs and other factors.
To gauge the response, Google has asked developers for their feedback on this change.