What the New CDC Report on Unvaccinated Left Out
A newly published CDC study prompted the mainstream media (MSM) to run headlines like "New study finds unvaccinated are 11 times more likely to die from Covid, CDC says," "Unvaccinated people were 11 times more likely to die of covid-19, CDC report finds," and "Unvaccinated People Are 11 Times More Likely To Die Of COVID-19, New Research Finds." It surely looks scary if you read only those headlines. I was curious about what the report actually says, and what it doesn't. Below is what I found.
- The purpose of the report is propaganda
The report admits in its Discussion section that "the data assessed from 13 jurisdictions accounted for 25% of the U.S. population, and therefore might not be generalizable." But it still suggests that "[t]he data might be helpful in communicating the real-time impact of vaccines (e.g., persons not fully vaccinated having >10 times higher COVID-19 mortality risk) and guiding prevention strategies, such as vaccination and nonpharmacologic interventions." No wonder the MSM outlets quickly spread this message around.
It could be dangerous to continue the mass vaccination campaign that, at the very least, helped drive the delta variant to become the predominant strain in the United States today. The report admits that "[f]indings from this crude analysis of surveillance data are consistent with recent studies reporting decreased VE [vaccine effectiveness] against confirmed infection." If the CDC keeps up this mass vaccination campaign in an effort to drive the delta variant out, what are we going to do when the next, more dangerous variant arrives?
- The representativeness of the data used in the study
The raw data used by the study were collected from thirteen jurisdictions, namely "Alabama, Arizona, Colorado, Indiana, Los Angeles County (California), Louisiana, Maryland, Minnesota, New Mexico, New York City (New York), North Carolina, Seattle/King County (Washington), and Utah." That accounts for 25% of the U.S. population. Interestingly, the study did not include the seven states with the lowest partial vaccination rates, which could have served as a reference group for comparison. The CDC study itself is not so confident about how representative the data are, conceding that the findings "might not be generalizable." Well, the MSM are already generalizing, using the misleading numbers published by the CDC.
Fitting a model to data means choosing the statistical model that predicts values as close as possible to the ones observed in your population.
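That idea is easy to see with a toy example. Below is a minimal sketch (the numbers are made up and have nothing to do with the CDC data) that fits a straight line to a few points by least squares, choosing the slope and intercept that bring the predictions as close as possible to the observations:

```python
import numpy as np

# Toy data: observed values with a roughly linear trend (invented for illustration).
x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 10.9])

# Least-squares fit of a degree-1 polynomial: picks the slope and intercept
# that minimize the gap between predicted and observed values.
slope, intercept = np.polyfit(x, y, deg=1)
predictions = slope * x + intercept

print(f"fitted line: y = {slope:.2f}x + {intercept:.2f}")
print("residuals:", np.round(y - predictions, 2))
```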
I was curious about how the raw data were manipulated by the study to produce the authors' conclusion, which states, in effect, that after Delta became the most common variant, fully vaccinated people had a 5X reduced risk of infection and a more than 10X reduced risk of hospitalization and death.
First, let's look at the data used by the study. In the report, it says, "Two analysis periods, April 4–June 19 and June 20–July 17, were designated, based on weeks with <50% or ≥50% weighted prevalence of the SARS-CoV-2 Delta variant for the 13 jurisdictions."
I happened to count the days in those two periods: 77 days and 28 days (11 weeks versus 4 weeks). The question is why those two periods were selected. The dividing point, June 19, has a specific meaning, as explained in the report: it marks when the delta variant's weighted prevalence crossed the 50% threshold. But why did the study give the first period roughly 2.75 times as many days as the second? That certainly raises a suspicion of model fitting. Below is a screen shot of a diagram in the report, Figure 1:
Figure 1.
The blue line marks June 19. If we limited the first period to the same 28 days as the second, it would start on May 23. From the diagram, we can see that the total number of infection cases for the unvaccinated people in that shorter first period would be much smaller, because the trend was declining rapidly; it may be below 100 (the black line). The curve for the vaccinated people's cases did not change much.
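The calendar arithmetic is easy to verify. Here is a short sketch (the period dates come from the report; the code and variable names are mine) that counts the days in each period and finds the start date that would give the first period the same four-week length as the second:

```python
from datetime import date, timedelta

# Analysis periods as defined in the report (2021).
period1_start, period1_end = date(2021, 4, 4), date(2021, 6, 19)
period2_start, period2_end = date(2021, 6, 20), date(2021, 7, 17)

# Inclusive day counts.
days1 = (period1_end - period1_start).days + 1   # 77 days (11 weeks)
days2 = (period2_end - period2_start).days + 1   # 28 days (4 weeks)
print(days1, days2, round(days1 / days2, 2))     # 77 28 2.75

# Start date that would make the first period as long as the second.
matched_start = period1_end - timedelta(days=days2 - 1)
print(matched_start)                             # 2021-05-23
```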
- The calculations
This is the most difficult part to discuss because it is very technical. Let me start with the raw summary data in the report. The first image, Table 1, includes two blocks of summary data: the "Totals" row combines the data from both periods, and the second block has the data for period 1, April 4–June 19. The second image, Table 2, has the data for period 2, June 20–July 17. I left out some data about average weekly incidence (per 100,000 population) and average weekly IRR (95% CI). Those figures are estimates computed with predefined formulas; if you follow those formulas, you will surely arrive at the same conclusions the report did.
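The report does not spell those formulas out in its text, but the standard calculation looks roughly like the sketch below. This is my own simplified, unweighted version (the real analysis is age-standardized and weighted by week), and the population denominators are placeholders, not the report's figures:

```python
def avg_weekly_incidence(events: int, population: int, weeks: float) -> float:
    """Average weekly events per 100,000 people (simplified, unweighted)."""
    return events / population / weeks * 100_000

# Placeholder denominators -- NOT the report's actual population figures.
pop_not_fully_vaccinated = 40_000_000
pop_fully_vaccinated = 35_000_000
weeks_period2 = 4  # June 20 - July 17

rate_unvax = avg_weekly_incidence(101_633, pop_not_fully_vaccinated, weeks_period2)
rate_vax = avg_weekly_incidence(22_809, pop_fully_vaccinated, weeks_period2)

# Incidence rate ratio (IRR): how many times higher the unvaccinated rate is.
# Illustrative only, since the denominators above are invented.
print(round(rate_unvax, 1), round(rate_vax, 1), round(rate_unvax / rate_vax, 1))
```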
But I want to look at the raw data to see something the report does not say. Let's look at the Cases data; the Hospitalizations and Deaths data can be analyzed the same way.
The "not fully vaccinated" case number in the first period is 467,509, which represents 95% of the total cases during that period. The "fully vaccinated" case number is 23,503, which represents 5% of the total cases. Similarly, the "not fully vaccinated" case number in the second period is 101,633 which represents 82% of the total cases. "Fully vaccinated" is 22,809, which represents 18% of total cases. Immediately, I can tell that the percentage for "fully vaccinated" increased a lot in the second period, from 5% to 18%! This is similar in Hospitalizations, from 7% to 14%, and in Deaths, from 8% to 16%. Does that mean that the effectiveness of vaccines declined 50% or more when the delta variant crossed a threshold of >50%?
Table 1.
Table 2.
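The percentage splits quoted above can be reproduced directly from the raw case counts in Tables 1 and 2. A quick sketch (the counts are the report's; the helper function is mine):

```python
def shares(not_fully_vaccinated: int, fully_vaccinated: int):
    """Each group's share of the combined case count, in percent."""
    total = not_fully_vaccinated + fully_vaccinated
    return 100 * not_fully_vaccinated / total, 100 * fully_vaccinated / total

# Cases, period 1 (April 4 - June 19) and period 2 (June 20 - July 17).
print([round(s) for s in shares(467_509, 23_503)])   # [95, 5]
print([round(s) for s in shares(101_633, 22_809)])   # [82, 18]
```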
Now, let's go back to what would happen if we shortened Period 1 to the same four weeks. I don't have access to the real data sets; I'm only making an estimate to illustrate the point. I modified the data for the first period, April 4–June 19, by dividing each number by 3: the period is roughly 2.75 times as long as the second, and because early April carried more cases, hospitalizations, and deaths, rounding the divisor up keeps the estimate for the period's final weeks from being overstated.
The two highlighted columns show the percentage change in the data from the first period to the second. For example, the "unvaccinated" cases declined: (133,574 - 101,633) / 133,574 = 23.91%.
Now you can see that under the impact of the delta variant in the second period, the number of cases among the "not fully vaccinated" went down, and so did their hospitalizations and deaths. That was also true before I adjusted the period lengths. But the "vaccinated" numbers increased by a lot: 191.14% for cases, 40.89% for hospitalizations, and 31.78% for deaths. That's terrible! Is that why the study had to use an 11-week first period, to avoid this bad look?
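For anyone who wants to check the highlighted columns, the arithmetic is the same percentage-change formula used in the example above. A short sketch for the two case rows (the period-1 figures are the scaled estimates described above, not numbers taken from the report itself):

```python
def pct_change(period1: float, period2: float) -> float:
    """Percentage change from period 1 to period 2 (negative values are declines)."""
    return (period2 - period1) / period1 * 100

# "Not fully vaccinated" cases: scaled period-1 estimate vs. reported period-2 count.
print(round(pct_change(133_574, 101_633), 2))    # -23.91

# "Fully vaccinated" cases: scaled period-1 estimate (23,503 / 3) vs. period-2 count.
print(round(pct_change(23_503 / 3, 22_809), 2))  # 191.14
```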
Image: qimono via Pixabay, Pixabay License.
To comment, you can find the MeWe post for this article here.