As cities and states roll out algorithms to help them deliver services like policing and traffic management, they are also racing to come up with policies for using this new technology.
AI, at its worst, can disadvantage already marginalized groups, adding to human-driven bias in hiring, policing and other areas. And its decisions are often opaque, making it difficult to tell how to fix that bias, as well as other problems.