HUD releases guidance on risk in using AI for advertising

The Department of Housing and Urban Development released guidance about the potential risk of noncompliance with fair housing laws in the use of artificial intelligence.

Covering platforms that provide targeted digital advertising as well as rental tenant screening, HUD’s notices follow a presidential order last year that directed the department to address AI concerns, particularly how automated technology might enable housing discrimination.


Federal officials, likewise, appear to be increasing their focus on the threats posed by artificial intelligence and its potential harmful impact on American interests.

“Housing providers, tenant screening companies, advertisers, and online platforms should be aware that the Fair Housing Act applies to tenant screening and the advertising of housing, including when artificial intelligence and algorithms are used to perform these functions,” said Demetria McCain, principal deputy assistant secretary for fair housing and equal opportunity, in a press release.

In its guidance for housing-related ads, HUD, which oversees the Federal Housing Administration, warned that algorithmic functions within AI tools may lead businesses to unintentionally engage in discriminatory practices.

“Algorithmic delivery functions may operate to exclude protected groups from an ad’s audience or to concentrate delivery to a protected group, an outcome particularly problematic for predatory products,” HUD said.

HUD noted a platform’s reliance on algorithms may eventually lead to advertising that results in price discrimination, even when businesses make an effort to target a diverse set of consumers.

Campaign outcomes may also cause AI systems to employ a discriminatory ad-delivery process, based both on the number and type of audience interactions they receive and on any disparities in the data the models were trained on.

Similarly, the creation of customized and “mirror” sets of consumers designed to match certain characteristics, such as attendees of an open house, could end up running afoul of the law when algorithms come up with the list of recipients, even when the original data did not exclude any protected classes.

Among the precautions HUD advised advertising platform vendors to take are separate processes for running housing-related ads and choosing audience segments, as well as specialized interfaces. The department also recommended platforms avoid providing target-consumer choices for ad delivery that might be discriminatory.

It additionally called for assessments of the data used to train AI systems and for safeguards to ensure algorithms are similarly predictive across all class groups, with adjustments made as necessary.
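
HUD’s notice does not prescribe a particular test, but one simple reading of “similarly predictive across all class groups” is to compare a model’s accuracy for each group against the best-performing group and flag large gaps. The minimal Python sketch below illustrates that idea; the record format, group labels, and 0.05 tolerance are assumptions made for the example, not anything taken from the guidance.

from collections import defaultdict

def group_accuracy(records):
    # Accuracy of the model's predictions, broken out by demographic group.
    correct, total = defaultdict(int), defaultdict(int)
    for rec in records:
        total[rec["group"]] += 1
        if rec["predicted"] == rec["actual"]:
            correct[rec["group"]] += 1
    return {g: correct[g] / total[g] for g in total}

def flag_disparities(records, tolerance=0.05):
    # Flag groups whose accuracy trails the best-performing group by more
    # than the (illustrative) tolerance, signaling a need for adjustment.
    accuracy = group_accuracy(records)
    best = max(accuracy.values())
    return {g: a for g, a in accuracy.items() if best - a > tolerance}

# Made-up screening records: predicted vs. actual tenancy outcome per applicant.
sample = [
    {"group": "A", "predicted": 1, "actual": 1},
    {"group": "A", "predicted": 0, "actual": 0},
    {"group": "B", "predicted": 1, "actual": 0},
    {"group": "B", "predicted": 0, "actual": 0},
]
print(flag_disparities(sample))  # {'B': 0.5} -> group B's accuracy warrants review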

At the same time, real estate-related businesses that advertise, such as agents and lenders, should “carefully consider the source, and analyze the composition, of audience datasets used for custom and mirror audience tools for housing-related ads” among different platforms when selecting which to purchase.

Advertisers would also benefit from monitoring the outcomes of their campaigns to identify and mitigate the risk of noncompliance with the Fair Housing Act.
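
The notices do not spell out how such monitoring should be done; one straightforward approach would be to compare each group’s share of delivered impressions with its share of the audience the advertiser intended to reach and investigate large gaps. The sketch below illustrates that approach; the group labels, impression counts, and 20 percent threshold are invented for the example.

def delivery_disparities(targeted, delivered, threshold=0.2):
    # Compare each group's share of delivered impressions with its share of
    # the targeted audience; flag groups under-served beyond the threshold.
    t_total = sum(targeted.values())
    d_total = sum(delivered.values())
    flags = {}
    for group, t_count in targeted.items():
        t_share = t_count / t_total
        d_share = delivered.get(group, 0) / d_total
        if t_share > 0 and (t_share - d_share) / t_share > threshold:
            flags[group] = {"targeted_share": round(t_share, 3),
                            "delivered_share": round(d_share, 3)}
    return flags

# Made-up campaign: both groups were targeted equally, but group B saw far fewer ads.
targeted = {"A": 5000, "B": 5000}
delivered = {"A": 4200, "B": 1800}
print(delivery_disparities(targeted, delivered))  # flags group B for review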

In other guidance regarding AI use in the rental market, HUD said tenant-screening services and housing providers would be better off with transparent machine-learning models in their selection process.

“If a highly complex model has a discriminatory effect, the model’s lack of transparency could make it hard to prove that a legally sufficient justification exists for the criteria used for a denial decision,” HUD said.

The guidance also arrives after several settlements and lawsuits in the past two years that point to the priority federal departments and agencies are placing on the enforcement and elimination of redlining and other forms of housing discrimination.

“Under this administration, HUD is committed to fully enforcing the Fair Housing Act and rooting out all forms of discrimination in housing,” said Acting Secretary Adrianne Todman.
