Colorado’s new law bans high-tech discrimination

Colorado state lawmakers are trying to get robots and computers to behave properly.

They recently passed SB 169, a bill that prohibits insurers from using algorithms, external data sources and predictive modeling systems in ways that, in lawmakers’ view, discriminate against people on the basis of “race, color, national or ethnic origin, religion, sex, sexual orientation, disability, gender identity or gender expression.”

The new law applies to issuers of life, disability and long-term care insurance, and to issuers of annuities, as well as to property and casualty insurers.

Governor Jared Polis, a Democrat, signed the bill earlier this month.

Data types

Lawmakers tried to preserve the ability of life, annuity, long-term care and disability insurance issuers to use traditional pricing factors such as family history, medical tests, occupational information, disability status and behavior that, based on sound actuarial principles, have a direct relationship to mortality, morbidity or longevity risk.

But the new law prohibits using even that kind of information, despite its direct link to mortality, morbidity or longevity risk, when it comes from an algorithm or a predictive model that relies on “external consumer data and information sources” and its use has the effect of unfairly discriminating against the protected categories of people.

An insurance underwriting “algorithm” is a set of rules that a computer or a human can use to decide whether to sell insurance to an applicant and how much to charge the applicant for the coverage.

A “predictive modeling system” is software that helps a computer use data, rules about how the world works, and statistical methods to make predictions.
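
To make the two definitions concrete, here is a minimal, purely illustrative Python sketch. Everything in it (the rules, thresholds, features and training data) is invented for illustration and does not come from the law or from any insurer; it only shows the difference between a fixed rule-based underwriting “algorithm” and a “predictive modeling system” that fits a statistical model to data.

```python
# Purely illustrative sketch; all rules, features and numbers are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression


def rule_based_underwriting(applicant: dict) -> tuple[bool, float]:
    """An 'algorithm' in the bill's sense: a fixed set of rules that decides
    whether to offer coverage and what base rate to charge."""
    if applicant["age"] > 80:
        return False, 0.0              # decline the application outright
    rate = 100.0                       # hypothetical base premium
    if applicant["smoker"]:
        rate *= 1.5                    # surcharge for a traditional risk factor
    return True, rate


# A 'predictive modeling system' instead fits a statistical model to historical
# data and uses the model's output to score new applicants.
X_train = np.array([[30, 0], [45, 1], [60, 0], [70, 1]])   # columns: age, smoker
y_train = np.array([0, 1, 0, 1])                           # 1 = claim filed

model = LogisticRegression().fit(X_train, y_train)
risk = model.predict_proba(np.array([[50, 1]]))[0, 1]      # estimated claim probability

print(rule_based_underwriting({"age": 50, "smoker": True}))  # (True, 150.0)
print(f"model-estimated risk: {risk:.2f}")
```

In both cases the output feeds the accept-or-decline and pricing decisions the law is concerned with.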

The new law defines “external consumer data and information sources” as “any data or source of information that is used by an insurer to supplement traditional underwriting or other insurance practices or to establish lifestyle indicators that are used in insurance practices.”

Some of these newer types of data sources are “credit scores, social media habits, locations, buying habits, homeownership, education level, occupation, professional licenses, civil judgments and court records.”
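
As a rough sketch of how those external fields differ from traditional underwriting inputs, the hypothetical record below (every field name and value is invented for illustration) separates the two groups; under the new law, it is the use of the second group in algorithms or predictive models that insurers will have to show does not produce unfairly discriminatory results.

```python
# Hypothetical applicant record; all field names and values are illustrative only.
traditional_underwriting_data = {
    "age": 42,
    "smoker": False,
    "medical_exam": "normal",
    "family_history": ["heart disease"],
    "occupation": "electrician",
}

external_consumer_data = {
    "credit_score": 640,
    "social_media_habits": "high activity",
    "purchasing_habits": ["online retail", "groceries"],
    "homeownership": False,
    "education_level": "high school",
    "civil_judgments": 1,
}
```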

Implementation

The new law directs Michael Conway, Colorado’s insurance commissioner, to develop regulations that will show insurers what they must do to demonstrate that their use of algorithms, predictive models and external data and information sources does not lead to unfair discrimination against protected categories of people.

Insurers and other parties will have the opportunity to respond to the new law during a public comment period. The insurance commissioner’s review is supposed to include consideration of any impact that implementing the rules would have on insurer solvency.

The law is set to take effect no earlier than January 1, 2023.

Insurers who believe the new rules are unworkable could try to block implementation by persuading the insurance commissioner that the rules would harm their solvency; by persuading the legislature or the commissioner to postpone the effective date; by persuading the legislature or the commissioner to repeal or amend the new law; or by challenging the new law in court.

The Consumer Federation of America

The Consumer Federation of America and other consumer groups have fought for years to persuade state lawmakers and state insurance regulators to keep insurers from using credit scoring and other high-tech automated analysis systems in ways that lead to unfair discrimination.

Douglas Heller, a federation representative, said in a comment welcoming Colorado’s new law that the law “directly targets insurance practices that have unfair and illegal results, regardless of the intent behind the practice.”

What this means

Science fiction writer Isaac Asimov developed the Three Laws of Robotics at a time when writers showed much more interest in robots than computers.

One version of Asimov’s first law states that “a robot cannot harm a human under any condition, and, as a corollary, must not allow a human being to come to harm through its own inaction.”

Colorado’s new law means that one of the first high-profile laws governing computer entities could be seen as prohibiting those entities from discriminating against protected categories of people when they buy insurance, even if the discrimination is not intentional.

