'While the idea of using data to direct police resources sounds like an effort to remove human bias from the equation, that isn't how it works in practice. In fact, predictive policing embeds police bias in an algorithm that then has the appearance of being neutral.' (Photo: Tony Webster/flickr)
"Predictive policing" sounds good on paper. After all, what could go wrong with a data-based approach to law enforcement?
It turns out: plenty. That's why Free Press joined a broad coalition of civil rights, privacy and technology groups in sounding the alarm about how predictive policing reinforces racial bias.
"Predictive policing" sounds good on paper. After all, what could go wrong with a data-based approach to law enforcement?
It turns out: plenty. That's why Free Press joined a broad coalition of civil rights, privacy and technology groups in sounding the alarm about how predictive policing reinforces racial bias.
The Leadership Conference on Civil and Human Rights mobilized the coalition, which counts the ACLU, the Brennan Center for Justice, Color Of Change and the NAACP among the 17 signers. The statement released last Wednesday notes that "the data driving predictive enforcement activities -- such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls -- is profoundly limited and biased."
Indeed, a damning report from the tech consulting group Upturn, which surveyed the nation's 50 largest police forces, confirms this view. Upturn found "little evidence" that predictive policing works -- and "significant reason to fear that [it] may reinforce disproportionate and discriminatory policing practices."
Nearly all of the predictive-policing systems in use in the United States come from private vendors. The systems draw on existing crime data to forecast where future crimes might occur. The idea is that this knowledge will help police departments determine where to focus their law-enforcement activities.
While the idea of using data to direct police resources sounds like an effort to remove human bias from the equation, that isn't how it works in practice. In fact, predictive policing embeds police bias in an algorithm that then has the appearance of being neutral.
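To make that feedback loop concrete, here is a minimal Python sketch. It's a toy illustration with invented numbers, not any vendor's actual system: two neighborhoods have identical real crime rates, but one starts with a larger recorded history simply because it was patrolled more heavily. The "neutral" allocator then perpetuates that disparity on its own.

```python
# A toy sketch of the predictive-policing feedback loop described above.
# Illustration only, with made-up numbers -- not any vendor's algorithm.
# Both neighborhoods have the SAME true crime rate, but "A" starts with
# more recorded incidents because it was historically patrolled more.

import random

random.seed(42)

TRUE_CRIME_RATE = 0.1            # identical underlying rate in both places
reports = {"A": 20, "B": 10}     # the biased historical record the model trains on

for day in range(365):
    # The "prediction": allocate 10 daily patrols in proportion to past reports.
    total = sum(reports.values())
    patrols = {hood: round(10 * n / total) for hood, n in reports.items()}

    # Police mostly record what they are present to observe, so more patrols
    # in A yield more reports from A, which raises tomorrow's "predicted"
    # crime in A. The data ends up confirming the allocation that produced it.
    for hood, n_patrols in patrols.items():
        for _ in range(n_patrols):
            if random.random() < TRUE_CRIME_RATE:
                reports[hood] += 1

print(reports)  # A ends the year with far more recorded crime than B;
                # the gap never self-corrects, despite identical true rates.
```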
The Upturn report explains that "criminologists have long emphasized that crime reports, and other statistics gathered by the police, are not an accurate record of all the crime that occurs in a community; instead, they are partly a record of law enforcement's responses to what happens in a community" [emphasis added].
This is a critical point: The police response to low-income communities -- in particular communities of color -- is completely different from the response to wealthy white communities. A recent GenForward poll shows that two-thirds of young African Americans and 40 percent of young Latinos and Latinas have either personally experienced violence or harassment at the hands of police or know someone who has. This is a big reason why predictive policing is so problematic; as last week's coalition statement notes, it's not an objective tool but one "engineered to support the status quo."
As my colleague Sandra Fulton observed in a recent post on surveillance of communities of color, police often "race to adopt new technologies without considering the potential harms or consulting with the communities they serve." Predictive policing is yet another example of this dangerous trend. And people's lives are on the line.