CFPB calls for tech workers to whistleblow on unethical AI

The US Consumer Financial Protection Bureau (CFPB) has called on tech workers in the consumer financial products and services and fintech sectors to report potential misconduct they witness.

“Since the CFPB began collecting whistleblower allegations a decade ago, we have seen the world transformed as data and technology, marketed as artificial intelligence (AI), have become commonplace in nearly every consumer financial market. These technologies can help intentional and unintentional discrimination burrow into our decision-making systems, and whistleblowers can help ensure that these technologies are applied in law-abiding ways.”

For example, while algorithmic mortgage underwriting is sometimes hailed as a way to significantly reduce housing discrimination, and many of those designing the algorithms seek to create a fairer housing market, that’s not always how things work out. In a recent study of over two million mortgage applicants, researchers found discriminatory effects from these new technologies: Black and Hispanic families were more likely to be denied a mortgage than similarly situated white families.

One researcher described a situation in which loan officers collect applicant information but algorithms make the decisions. Whether such a process removes or embeds discrimination depends on a number of factors, including the types of data collected, how they are weighted, and how decisions are reviewed.
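To make that point concrete, here is a minimal, purely hypothetical sketch in Python. It does not reflect any real underwriting model or the study’s methodology; the feature names, weights, and distributions are invented for illustration. It simply shows how a nominally neutral input (here a made-up “neighborhood index”) that correlates with a protected class can produce different denial rates for otherwise similarly situated groups, depending on how features are weighted.

```python
# Hypothetical illustration only: a toy scoring function in which a
# seemingly neutral feature acts as a proxy for a protected class.
import random

random.seed(0)


def underwriting_score(income, debt_ratio, neighborhood_index, weights):
    """Linear score over applicant features; the weights determine how
    strongly each feature pushes toward approval or denial."""
    return (weights["income"] * income
            - weights["debt_ratio"] * debt_ratio
            - weights["neighborhood"] * neighborhood_index)


def decide(score, threshold=0.5):
    """Approve if the score clears a fixed cutoff."""
    return "approve" if score >= threshold else "deny"


def simulate(group_neighborhood_bias, n=10_000, weights=None):
    """Simulate applicants with identical income and debt profiles, but
    different distributions of the penalized neighborhood index."""
    weights = weights or {"income": 1.0, "debt_ratio": 0.5, "neighborhood": 0.8}
    denials = 0
    for _ in range(n):
        income = random.gauss(1.0, 0.2)
        debt_ratio = random.gauss(0.4, 0.1)
        neighborhood_index = random.gauss(group_neighborhood_bias, 0.1)
        score = underwriting_score(income, debt_ratio, neighborhood_index, weights)
        if decide(score) == "deny":
            denials += 1
    return denials / n


# Similarly situated applicants; only the neighborhood distribution differs,
# yet the denial rates diverge sharply.
print("Group A denial rate:", simulate(group_neighborhood_bias=0.2))
print("Group B denial rate:", simulate(group_neighborhood_bias=0.6))
```

In this toy setup, auditing only the explicit inputs would show no protected attribute in use; the disparity emerges from how a correlated proxy is weighted, which is why the data collected, the weighting, and the review process all matter.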

“I encourage engineers, data scientists and others who have detailed knowledge of the algorithms and technologies used by companies and who know of potential discrimination or other misconduct within the CFPB’s authority to report it to us,” the CFPB said in a statement. “Whistleblowers can affect history, drive change, and defend individuals and families against corporate wrongdoing.”

The financial services watchdog has updated its reporting site with additional detail on how to submit a tip and what happens after submission, as well as descriptions of the kinds of information it is seeking, such as information about the use of machine learning models and algorithmic bias related to consumer financial products and services.
