Microsoft Strengthens Restrictions on Police Utilizing Generative AI for Facial Recognition

Microsoft has updated its policy regarding the use of generative AI for facial recognition by U.S. police departments through its Azure OpenAI Service. The TechCrunch report highlights that Microsoft now explicitly prohibits the integration of Azure OpenAI Service with police departments in the United States for facial recognition purposes. This prohibition extends to any current and potential future image-analyzing models developed by OpenAI.

The revised policy also includes a global ban on the use of real-time facial recognition technology on mobile cameras, such as body cameras and dashcams, to identify individuals in uncontrolled environments.

These changes follow Axon’s announcement of a new product that uses OpenAI’s GPT-4 generative text model to summarize audio from body cameras. Critics have raised concerns about the application, including generative AI models’ tendency to fabricate facts and the racial biases such models can inherit from their training data.

While it is unclear whether Axon was using GPT-4 through Azure OpenAI Service, or whether the policy update was prompted by the product launch, the change is consistent with Microsoft and OpenAI’s recent approach to AI-related law enforcement and defense contracts.

The new terms ban U.S. police departments outright from using Azure OpenAI Service for facial recognition; the ban does not extend to international law enforcement. Likewise, the mobile-camera restriction does not cover facial recognition with stationary cameras in controlled environments, though for U.S. police any facial recognition use through the service remains strictly prohibited.

Microsoft and OpenAI have been engaging with government agencies recently, with OpenAI collaborating with the Pentagon on cybersecurity projects and Microsoft proposing the use of OpenAI’s tools for military operations.

For more information, see the full TechCrunch report.
