"Predictive policing" sounds good on paper. After all, what could go wrong with a data-based approach to law enforcement?
It turns out: plenty. That's why Free Press joined a broad coalition of civil rights, privacy and technology groups in sounding the alarm about how predictive policing reinforces racial bias.
"Predictive policing" sounds good on paper. After all, what could go wrong with a data-based approach to law enforcement?
It turns out: plenty. That's why Free Press joined a broad coalition of civil rights, privacy and technology groups in sounding the alarm about how predictive policing reinforces racial bias.
The Leadership Conference on Civil and Human Rights mobilized the coalition, which counts the ACLU, the Brennan Center for Justice, Color Of Change and the NAACP among the 17 signers. The statement released last Wednesday notes that "the data driving predictive enforcement activities -- such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls -- is profoundly limited and biased."
Indeed, a damning report from the tech consulting group Upturn, which surveyed the nation's 50 largest police forces, confirms this view. Upturn found "little evidence" that predictive policing works -- and "significant reason to fear that [it] may reinforce disproportionate and discriminatory policing practices."
Nearly all of the predictive-policing systems in use in the United States come from private vendors. The systems draw on existing crime data to forecast where future crimes might occur. The idea is that this knowledge will help police departments determine where to focus their law-enforcement activities.
While the idea of using data to direct police resources sounds like an effort to remove human bias from the equation, that isn't how it works in practice. In fact, predictive policing embeds police bias in an algorithm that then has the appearance of being neutral.
The Upturn report explains that "criminologists have long emphasized that crime reports, and other statistics gathered by the police, are not an accurate record of all the crime that occurs in a community; instead, they are partly a record of law enforcement's responses to what happens in a community" [emphasis added].
This is a critical point: The police response to low-income communities -- in particular communities of color -- is completely different from the response to wealthy white communities. A recent GenForward poll shows that two-thirds of young African Americans and 40 percent of young Latinos and Latinas have either personally experienced violence or harassment at the hands of police or know someone who has. This is a big reason why predictive policing is so problematic; as last week's coalition statement notes, it's not an objective tool but one "engineered to support the status quo."
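To see how enforcement-driven data can entrench the status quo, consider a minimal, hypothetical simulation of the feedback loop critics describe. This is not any vendor's actual system; the area names, constants, and patrol-allocation rule below are illustrative assumptions only. Two neighborhoods have the same underlying crime rate, reported crime rises with patrol presence, and the "forecast" simply sends patrols back toward past reports.

```python
# Toy simulation (hypothetical): two areas with identical true crime rates,
# but reports depend on how much enforcement is present. A forecast built on
# those reports keeps steering patrols toward the already over-policed area.
import random

random.seed(42)

TRUE_CRIME_RATE = 10          # actual incidents per week, identical in both areas
DETECTION_PER_PATROL = 0.08   # chance a single patrol unit observes a given incident

def reported_crimes(patrols: int) -> int:
    """Reports scale with patrol presence, not with the underlying crime rate."""
    detection_prob = min(1.0, patrols * DETECTION_PER_PATROL)
    return sum(1 for _ in range(TRUE_CRIME_RATE) if random.random() < detection_prob)

# Start with unequal coverage: Area A is patrolled twice as heavily as Area B.
patrols = {"Area A": 8, "Area B": 4}
history = {area: [] for area in patrols}

for week in range(10):
    for area in patrols:
        history[area].append(reported_crimes(patrols[area]))
    # "Predictive" step: reallocate 12 total patrol units in proportion to each
    # area's reported-crime history -- the forecast mirrors past reports.
    totals = {area: sum(counts) for area, counts in history.items()}
    grand_total = sum(totals.values()) or 1
    patrols = {area: max(1, round(12 * totals[area] / grand_total)) for area in totals}

print("Final patrol allocation:", patrols)
print("Total reported crimes:  ", {a: sum(c) for a, c in history.items()})
# Despite identical TRUE_CRIME_RATE, Area A ends up with more patrols and more
# reports: the model has learned the department's own deployment pattern.
```

Running the sketch, the over-patrolled area keeps "generating" more reported crime and therefore keeps attracting more patrols, even though the two areas are, by construction, identical. That is the sense in which a system trained on enforcement data is engineered to support the status quo rather than measure it.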
As my colleague Sandra Fulton observed in a recent post on surveillance of communities of color, police often "race to adopt new technologies without considering the potential harms or consulting with the communities they serve." Predictive policing is yet another example of this dangerous trend. And people's lives are on the line.