> but job reports data over recent years have noticeably shifted towards downward monthly revisions. Prior to the pandemic response, the graph [1] looks much more balanced between positive and negative revisions
Yes. The reasons for this are well documented. Changing the methodology behind the preliminary estimates is a rigorous (and therefore slow) process. That means the published estimates lag the best available estimates, something the primary sources note in every release, if one reads past the headlines.
Also, if you have one year of massive job gains followed by four years of flat and falling employment, you'll spend most of that stretch with revisions biased in one direction. Again, not a sign of methodological problems; just a predictable artifact that users are supposed to incorporate before acting on, much less emotionally reacting to, the data.
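To make that concrete, here's a toy simulation (my own sketch, not the BLS's actual estimation procedure): suppose each month's preliminary figure is a naive extrapolation of the trailing six-month average, and the "final" figure is the true value revealed by later data. When the truth shifts from strong gains to flat-and-falling, the trailing average lags the turn, so revisions come in mostly downward even though the estimator itself is unbiased in a steady regime.

```python
import random

random.seed(0)

# True monthly job changes: 12 months of strong gains (~+300k),
# then 48 months fading from modest gains into losses.
truth = [random.gauss(300, 30) for _ in range(12)] + \
        [random.gauss(100 - 6 * t, 30) for t in range(48)]

revisions = []
for t in range(6, len(truth)):
    # Preliminary estimate: only trailing data is available at release time.
    preliminary = sum(truth[t - 6:t]) / 6
    final = truth[t]  # what later, fuller data reveals
    revisions.append(final - preliminary)

downward = sum(1 for r in revisions if r < 0)
print(f"{downward} of {len(revisions)} revisions are downward")
```

In a flat regime the same estimator produces roughly balanced revisions; the one-sidedness appears only because the level keeps moving in one direction while the estimator looks backward.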
Why would the shift to a new methodology bias the estimates in one direction? I would expect a new methodology to make comparisons between the two systems potentially unhelpful, but I wouldn't expect a valid methodology to be biased one way or the other.
Related, I wouldn't expect past data to bias a current estimate. If 6 or 12 months of positive growth biases the next prediction, that falls into the hot-hand fallacy. It isn't estimating based on current data; it's extrapolating from recent past behavior. That only makes sense to do if the current data is not yet available, and even then the extrapolation isn't a useful estimate of current conditions.