
Google's New Pledge Not to Be Evil


Alphabet Inc. (NASDAQ: GOOGL) is supposed to be good and not evil, according to an early statement of its principles. That has not always worked out. Its self-driving Waymo cars, like most others, are not entirely safe. Government officials, particularly in the European Union, think that Google holds too much of the search market. Google also collects customer data, which worries privacy advocates. Now, the company has decided to post a number of things it won’t do, particularly with its artificial intelligence (AI) operations.

The focus of the decisions is that Google will not allow its AI to be used for military products. Evil or not, those applications are ultimately developed to make war.

Beyond that, Sundar Pichai, CEO of Google, wrote in a blog post:

[W]e will not design or deploy AI in the following application areas:

Technologies that cause or are likely to cause overall harm. Where there is a material risk of harm, we will proceed only where we believe that the benefits substantially outweigh the risks, and will incorporate appropriate safety constraints.

Weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people.

Technologies that gather or use information for surveillance violating internationally accepted norms.

Technologies whose purpose contravenes widely accepted principles of international law and human rights.

We want to be clear that while we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas. These include cybersecurity, training, military recruitment, veterans’ healthcare, and search and rescue. These collaborations are important and we’ll actively look for more ways to augment the critical work of these organizations and keep service members and civilians safe.

Why? Google decided to return to some of its roots. Its products are meant to help society, even if the company occasionally looks at some people’s private data. It does not want to foster “unfair bias,” though some of its ads may have been placed by people with other intentions, and Google has tried to keep those off its sites. It also wants to be “accountable to people.” The same goes for the personal data collection issue.

Google has a set of new rules that look good on paper, although some of them may not yet be in place.

 
