Behind Closed Doors

Amalia Miller

In 2020, we worried that COVID lockdowns might lead to an increase in domestic violence. Instead, the opposite occurred. Why did this happen — and why was it so hard to figure out?

Asterisk: Early in 2020, a media narrative coalesced around the idea that COVID lockdowns were likely to lead to an increase in domestic violence. In a series of papers, you and your co-authors Carmit Segal and Melissa Spencer investigated this issue. You found that it’s likely the opposite happened: Domestic violence actually decreased during the early months of lockdowns. Where did that initial idea come from and how did you and your co-authors reach a different conclusion? 

Amalia Miller: In very early 2020, before COVID broke out in the U.S., anecdotal evidence was emerging that domestic violence had increased in China as a result of its lockdowns. And so there was a lot of concern from advocacy groups that something similar might happen in the U.S. I think those were legitimate and important concerns.

As various locations in the U.S. began to impose lockdowns, both anecdotes and some very early data started to get coverage in the press. These were sometimes contradictory: one story might report that calls to a hotline in one city increased by 40%, while calls in another city decreased.

Beyond that, in some sense, whatever you found could be interpreted as pointing to increased domestic violence. There was the concern that violence might increase, and then a separate concern that victims might be “trapped” with their partners in the home and therefore less likely to call for help. So stories that reported a decrease in calls suggested the reason wasn’t that violence was going down, but that reporting was being suppressed.

I think that if you looked carefully at the initial reports, they weren’t all showing that the indicators went up. But a lot of the narrative interpreted them as evidence of increased violence, whether the measure went up or down.

The challenge in measuring crime is that crime is neither universally observed nor universally reported. Any crime measure we have is filtered through the lens of what gets reported to our data source. That issue applies to a lot of crime — unfortunately, a lot of crime never gets reported or recorded at all — and it always applies to domestic violence.

There were two measures that we looked at in our initial work. First, there are calls for service — “dispatches” or “911 calls.” Basically, someone calls the police, they report a crime, that gets logged, and there’s a file. Some departments make that data available. The other measure that police sometimes make available is “criminal incident reports.” These are filed by an officer who has looked into the call: if they think a crime likely happened, they make a report. This is in some sense a more validated, or at least narrower, measure. So we looked at both. And what we found is that in police departments that had both measures available, there were opposing patterns.

A: Calls went up and criminal reports went down.

AM: Yes, for the most part, calls went up and criminal reports went down. 

And, in fact, that does square with some of the earliest academic studies, which found that calls were increasing — so in that sense the early media narrative seemed validated. People already held that expectation based on the initial concern.

But we were trying to figure out whether calls and criminal reports were measuring the same thing. There was also the fact that studies were coming out of different cities. So we said, “Let’s look at the same set of cities and determine whether it’s about the measures or about the locations.” Because it could be that crime went up in one city and down in another because of differences in how the pandemic affected those populations. So our first step was just holding the set of cities fixed. In our paper in the Journal of Urban Economics, we looked at 18 cities and we found that, across our sample, calls go up during shutdowns and domestic violence assaults go down.

A: One of the things you did to understand this data better was go back to the months preceding March and April 2020, when lockdowns started. And what you found was that the increase in domestic violence calls had actually begun before the initial lockdown.

AM: Right. We tracked data from the start of 2020 up to when the first cities started reopening. But we don’t just compare before the lockdown with after the lockdown starts, because there is seasonality in domestic violence. In the 18-city paper, we use the seasonal variation between January, February, March, April, and May in the 2019 data to account for that seasonality, and then we look at how the pattern differs in 2020 in order to estimate the effect of lockdowns.
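To make the design concrete, here is a minimal sketch of that seasonal-adjustment idea in Python. The data, city names, and regression specification are illustrative placeholders, not the authors’ actual code:

```python
# A minimal sketch of the seasonal-adjustment idea described above, using
# synthetic placeholder data -- not the authors' actual code or specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
dates = pd.date_range("2019-01-01", "2020-05-31", freq="D")
cities = ["city_a", "city_b", "city_c"]
df = pd.DataFrame([(d, c) for d in dates for c in cities],
                  columns=["date", "city"])
df["month"] = df["date"].dt.month
df["year"] = df["date"].dt.year
# Hypothetical shutdown indicator: on from mid-March 2020 onward.
df["shutdown"] = (df["date"] >= "2020-03-15").astype(int)
# Synthetic outcome: a seasonal rise into spring, plus noise.
df["calls_per_100k"] = 5 + 0.3 * df["month"] + rng.normal(0, 1, len(df))

# Month effects absorb the usual January-May seasonal pattern (identified
# largely from 2019); city and year effects absorb level differences. The
# coefficient on `shutdown` is then the 2020 deviation from that baseline.
model = smf.ols("calls_per_100k ~ shutdown + C(month) + C(year) + C(city)",
                data=df).fit()
print(model.params["shutdown"])
```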

Trends for (A) police reports of DV assault crimes and (B) DV service calls to police. Daily trends were calculated as the 7-day moving average of daily records, aggregated across cities, per 100,000 total population served. The dashed vertical line on March 14 indicates the date after the nationwide emergency declaration and the solid vertical lines indicate the dates of city shutdowns. Source: Effects of COVID-19 Shutdowns on Domestic Violence in US Cities, Amalia R. Miller, Carmit Segal, Melissa K. Spencer

A: So if journalistic outlets were only looking at an increase in calls in the first month of lockdowns, they would have seen one. But unless they were looking longitudinally and were aware of the seasonal patterns, they would not necessarily have caught that the increase was part of the same seasonal trend.

AM: Right. If you didn’t consider that domestic violence tends to be higher in March, April, and May than in January and February, you might have just attributed that increase in 2020 to lockdown.

A: Do you have any idea why it’s seasonal?

AM: We’ve thought about this a lot and we still don’t know exactly. There’s evidence that associates crime with increases in temperature, so maybe when it’s colder there’s less violence. But I don’t know how that data, which is mostly on street crime, carries over to crime in the home.

A: So apart from that, what underlying pattern do you think explains the data you found? 

AM: Okay. What we end up seeing across these 18 cities is calls to police going up, but the police are recording fewer crimes. So we need to have a way to reconcile that. 

There are two major stories that we think could explain it — though we end up favoring one of them. 

One possibility is that the increase in calls was coming from a higher reporting rate to the police. Essentially, the pattern we were seeing reflected more calls to the police per actual incident of domestic violence in the population than normal. If the additional calls aren’t connected to real incidents of domestic violence, that could explain the pattern where calls are increasing but crime is decreasing.

Now, why would calls go up? There are two reasons we mention. 

One is the increased attention on domestic violence. Public-health messaging included hotline numbers to call; it encouraged people to look out for domestic violence and to seek help. There was also additional funding. So greater attention to the issue might have led to higher reporting rates.

It’s also possible that because people were home more during shutdowns, they were more likely to overhear things happening with their neighbors — or just more likely to call the police. And some of those calls come from what we call a “third party” — someone who’s not the perpetrator or the victim, such as a neighbor — who may not have full information. So they might call the police to report a domestic violence incident, and the police find that it’s something else: a verbal dispute but not a physical fight, for example, or a physical fight that doesn’t involve two domestic partners.

Ultimately, we’re not able to look at who the callers were in the data from the U.S. There is a study in Greater London that did have police data on calls and found that the increase in calls in London during their lockdown was driven entirely by third-party callers. So that was part of why we thought that might be what was going on.

So our thinking was: If the neighbors are calling the police more because of nuisance and noise, would we see that in other measures, such as nuisance calls? We found an increase in general disturbance and noise calls associated with the shutdowns as well, which is consistent with neighbors complaining more about noise generally.

And finally, for some of the cities, we had data that let us distinguish between calls about physical violence and calls about verbal disputes. There, we find that the statistically significant increase in calls is coming from calls that are coded as being verbal disputes, and that there wasn’t a significant increase in calls coded as being physical violence.

A: So all of this points to a scenario in which more people are home, hearing more noises, and because the possibility of domestic violence is quite salient to them, they’re placing more calls.

AM: Yes. That’s all consistent with that story. It doesn’t prove it, but it supports it. 

So then what we did was ask, “What’s the other story?” If the pattern isn’t coming from an increase in reporting rates, the other possibility is that crime didn’t actually go down, and the increase in calls really did reflect more violence. But then that explanation has to answer the question of why recorded crime went down if actual crime was going up.

The story that has been offered is that it was coming from police under-response, or police shirking of these cases. And let me be more precise. There’s some baseline level at which police are going to attend to domestic violence cases. The idea is that because of the pandemic, police just put less effort into investigating these cases; maybe because they didn’t want to get infected, or maybe because they didn’t want to arrest people because of COVID outbreaks in jails. In that story, the police are under-recording crime. So they get called in and they don’t investigate, or they investigate but don’t decide to file a report. 

But we did not find support for that interpretation in the data.

A: You looked at reports of other violent crimes and found that other violent assaults were also down during this period, including crimes that are very hard to miss.

AM: We do find a decrease in aggravated assaults, too. And so if it’s a story about the police not wanting to write a report, you would expect that would only hold true for less serious crimes. But for aggravated assault, you would expect the police to file a report because that’s not a marginal crime. That’s a very serious crime. And so because we saw reports decreasing in both cases, we thought it was unlikely that shirking could explain a reduction there. 

Another thing we checked was whether any police departments had made statements saying that they changed their policies on domestic violence calls. We don’t see that. In fact, we find statements saying, “Domestic calls are in our high-priority category.”

So the police are at least saying, and their official policies are saying, that they’re not treating this any less seriously, which is what you’d expect them to say. So then what we did empirically (and we could only do this for five of the 18 cities in our sample where we had data on response times) was measure the amount of time it took for the police to respond to the 911 call.

What we thought was, “Well, if the police were investing less effort, if they were kind of dragging their heels on these cases or taking them less seriously, then we would expect the response times to go up.” And in fact, we find that they go down during the shutdown. 

A: You triangulated the data you were seeing from calls and from police in a few other ways — for example in national surveys. Can you speak about that? 

AM: So in the 18-city paper, we looked at a few other sources for triangulation. One source is federal data. Another way to measure domestic violence, which we’ve used in prior work, is to use suicides as a proxy measure. Suicides are not precisely linked to domestic violence by any measure, but they are potentially related to it. And we find no increase in suicides in the lockdown quarter: the change from the previous quarter in 2020 was no larger than the change over the same quarters in 2019.
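Schematically, this is a difference-in-differences comparison. In illustrative notation (not the paper’s), letting $S_{y,q}$ be the suicide count in quarter $q$ of year $y$ and taking Q2 of 2020 as the lockdown quarter, the finding is

$$\left(S_{2020,\mathrm{Q2}} - S_{2020,\mathrm{Q1}}\right) - \left(S_{2019,\mathrm{Q2}} - S_{2019,\mathrm{Q1}}\right) \approx 0,$$

that is, the lockdown quarter showed no rise in suicides beyond the usual quarter-to-quarter change seen in 2019.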

Another strategy, with both pros and cons, is to use survey data. The National Crime Victimization Survey (NCVS) run by the Justice Department is a household survey of individuals. It asks about experiences of crime and victimization. Here, the main benefit is that the survey asks about crimes that may or may not have been reported to the police, so it provides a metric that isn’t dependent on either people calling the police or police deciding what to record. 

It has its own limitations, because people have to be willing to report crime to the survey — and because it’s a sample of 50,000 households meant to represent the whole country, it’s not a comprehensive count, and there aren’t necessarily huge numbers of crimes in the data. From that data, we found a relative decrease in intimate-partner violence.

The other source of data we used is the UCR, or Uniform Crime Reporting system, which records intimate-partner homicides.

Death and nonfatal crime rates from pre-pandemic and pandemic shutdown months in 2020 and from the same months in 2019.
Source: Effects of COVID-19 Shutdowns on Domestic Violence in US Cities, Amalia R. Miller, Carmit Segal, Melissa K. Spencer

A: The Uniform Crime Report is produced when the FBI aggregates administrative data on crimes from all over the country and then publishes a national report.

AM: Yes.

A: You have another paper that looks at the effect that female law enforcement officers have on crime. You found that as the percentage of women in a police department increases, you see increased reporting of domestic violence and, by some measures, higher police quality. Do you see a connection between that paper and this work? And is there any evidence that the early media focus helped to head off domestic violence?

AM: That paper definitely provided the background and motivation for this work. It formed part of our framework for thinking about these measurement issues. When COVID hit and there was all this press attention about domestic violence, we were concerned as humans, but we were also concerned as researchers about the difficulty of measuring whatever would happen.

In terms of the second point about whether we think that the additional resources and attention may have led to increased calls and that maybe this could be why the crimes went down — I do think so. It’s speculative. We can’t prove it. 

But we think that’s an important part of our interpretation and why these results are important. In some sense, does it matter if people have the wrong narrative about what happened? I think it does, beyond the reason that, as scientists, we want to know the truth.

Policymakers who are focused on the effects of different containment measures in the pandemic should want to know what impact this increased attention had. We should want to know whether all of these measures had an effect. If domestic violence went up despite this increased attention, that would suggest these measures don’t help. 

To give an illustration, imagine that additional calls were coming from verbal disputes, not domestic violence crimes. It could be that having the police show up for a verbal dispute has a deterrent effect by reducing escalation and thereby reducing crimes. The fact that there was this significant decrease in domestic violence suggests that there are potential opportunities to reduce it. We don’t need a pandemic to do it; we need extra attention to the issue. It seems like maybe the pandemic motivated this extra attention, and maybe that extra attention would be useful after the pandemic, too. And I think that if we don’t pursue the data and figure out what really happened, then we might miss that conclusion.

A: That’s a good segue to talk about crime measurement more broadly. I am interested in some of the issues with the National Crime Victimization Survey, such as discrepancies between administrative data and survey data, and how the NCVS sometimes has estimates that are very different from others. For instance, rates of rape described in the NCVS are much lower than in the National Women’s Survey, or other surveys where questions are worded a bit differently. I’m curious if that is the pattern for domestic violence as well.

AM: I think that domestic violence is likely underreported in the NCVS relative to the true incidence. In my mind, I distinguish between the “crime type approach” and the “health type approach” because they define violence somewhat differently. The criminal justice approach is focused on what’s a crime; the health approach is broader, and might include more psychological and coercive instances. Because it’s broader, it’s going to pick up higher rates.

It’s also possible that different ways of framing the question lead to different responses. A health survey might make people more willing to answer certain questions about experiences they’ve had than a crime survey would, where you’re being asked about theft and other crimes.

The NCVS has put some effort into changing and redesigning questions. But in the study on female police officers, we couldn’t use the NCVS data on rape or sexual violence because of the way they ordered the questions. The reason was that they first asked whether you were the victim of an assault. Only then did they ask about the perpetrator and the nature of the assault. So if somebody was a victim of a sexual assault, they might not consider it “an assault” — and therefore wouldn’t be asked the follow-up questions. But if you asked whether they were “raped” or “touched,” you might have gotten different responses. They changed that in the 1990s. The way we handled it in our paper was basically to use the earlier version and cut off the time series, because we couldn’t compare across the different versions of the survey.

Even after the redesign, the survey was heavily criticized, and there were further attempts at reform. However, it is still criticized, or at least it still finds lower rates than other measures. So I don’t think that redesign can necessarily address all of the issues.

A: I went down a rabbit hole when I was doing research for this on the history of the NCVS. In the ’80s, they did all this pilot research on attention, trying to figure out things like how long they could even make the survey, what order to put questions in so that people would answer them, and how to balance wanting to record more crimes against the fact that there are more errors in memory the further back you ask people to go.

AM: Yeah, I think it’s a flawed object. But it is also a really valuable resource, and a lot of thoughtful attention goes into these details. And then when you think about a crime like domestic violence, sometimes their focus is on a particular incident, but then they have ways of trying to account for the fact that it is often a serial crime. For example, if there were multiple incidents that are kind of similar and not necessarily distinguishable, how do you count that?

A: I’m curious if you as a researcher have a wish list of what you could change about how this data is recorded or collected. 

AM: We don’t lay out a specific wish list in our paper, but we do have some complaints and desiderata for how departments report data. I think that standards could be beneficial. There have been some efforts to coordinate. But, right now, researchers are just gathering whatever data they can — and there’s no quality control or comparability.

A: When you say you want quality controls for police reporting data, what does that look like? What are the problems that you run into when you’re working with this data in a granular way?

AM: The general point I’m making is that if there’s no checking, there could be errors. Not necessarily intentional errors, but there could be something that is, for instance, transcribed wrong. 

The other issue is the consistency of definitions and what actually gets reported. Granularly, what came up for us when we were trying to put together a database from multiple police departments is that different departments had different ways of coding an incident. In Los Angeles, an incident could include up to five crimes, and they would give the details on all five of them. Some cities would also record up to four crimes per incident, but they would only provide details about the most severe one. So if you’re trying to identify domestic violence incidents and the most severe crime wasn’t domestic, the incident might not appear in the data.
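To make the coding problem concrete, here is a minimal sketch in Python. The schemas and offense codes are hypothetical, not the departments’ real data formats:

```python
# Hypothetical illustration of the coding problem: two departments log the
# same underlying incident differently. Offense codes are invented.
from dataclasses import dataclass

@dataclass
class Incident:
    incident_id: str
    offenses: list[str]  # offense codes, most severe listed first

# Department A reports every offense attached to the incident.
dept_a = Incident("A-001", ["aggravated_assault", "dv_simple_assault"])
# Department B reports only the most severe offense for the same facts.
dept_b = Incident("B-001", ["aggravated_assault"])

def is_dv(incident: Incident) -> bool:
    """Flag an incident as domestic violence if any listed offense is DV."""
    return any(code.startswith("dv_") for code in incident.offenses)

print(is_dv(dept_a))  # True: the DV offense is visible in the record
print(is_dv(dept_b))  # False: the DV offense was dropped, so this incident
                      # silently falls out of any DV count
```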

As a researcher, the hardest thing for me about the NCVS is that it’s difficult to access geographic detail. But even with that detail, the underlying challenge is that it’s just a survey. Even though it’s big, my dream would be for it to be bigger, because if you want to look at crime in rural areas or in smaller population areas, it’s just not going to be reliable for crimes that are not frequently reported.

A: Didn’t you have to submit Freedom of Information Act (FOIA) requests to get a lot of the data from police departments? It wasn’t as if it was just available online. You really had to go searching for it.

AM: Right. So we didn’t FOIA every department in the country. Some of the departments that make data available do so because they had been FOIAed in the past. And others had a local government that embraced the idea of open data and put something out there. But there’s no general requirement that they make data available.

A: So what you’re saying is we need a crowdsourced campaign of FOIA requests to all police departments across the country, so that it just pushes them to make data available.

AM: Well, I’m not against that. I do think that some researchers have taken advantage of either FOIA or lawsuits against police departments to get data and other insights they wouldn’t otherwise be able to get. Needing to go through that seems like a limitation, or a flaw in the system.

This figure depicts monthly trends in New York City for DV Felony Assaults (A) and DV radio runs (B).
Source: Effects of COVID-19 Shutdowns on Domestic Violence in US Cities, Amalia R. Miller, Carmit Segal, Melissa K. Spencer

Amalia Miller is the Georgia S. Bankard Professor of Economics at the University of Virginia. Her current research examines topics related to gender and family economics as well as healthcare technology and data privacy.

Published October 2023

Have something to say? Email us at letters@asteriskmag.com.
