Interesting data, Larry. To piggyback off your post, I compared some NOAA graphs of Charlotte vs. RDU min temperatures, and the results were not what I expected.
View attachment 5645
View attachment 5646
What does all this mean? Who knows, but it sure is interesting to see the variability. Could it be that Charlotte has reached a point where the heat island effect is "baked in" and no longer contributes additional warming, hence the slight cooling in min temperatures? Perhaps that would explain why Raleigh (and Lumberton) still shows an uptrend in min temps, since it hasn't hit that point yet, while a very rural area like Cherokee County has seen cooling. I also wonder what the raw data would show for these areas. Here's the link I use to generate the above graphs in case you want to try it out.
https://www.ncdc.noaa.gov/cag/city/time-series
Thanks for this great link! I dug deep into these 1980-2017 numbers for Charlotte and Raleigh. First of all, the reason Charlotte's lows come out at -0.1/decade (really more like -0.06, which rounds to -0.1) is November's -1.1. Why did Nov.'s lows trend so steeply colder over 1980-2017? Because 1985 and '86 in the first decade were very mild while 2008, '12, '13, and '14 in the last decade were very cold. Had Nov been flat, Charlotte's min trend for 1980-2017 would have been about +0.04 instead of -0.1.
Month by month, here are the 1980-2017 changes per decade for Charlotte mins:
J: 0.0
F: +0.1
M: 0.0
A: +0.2
M: +0.3
J: 0.0
J: -0.3
A: -0.1
S: +0.1
O: -0.4
N: -1.1
D: +0.5
Annual: -0.1
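To sanity-check the November point, here's a quick sketch. It assumes the annual trend is roughly the average of the twelve monthly trends, which is only an approximation (NOAA fits the annual series directly, so it won't match exactly):

```python
# Charlotte 1980-2017 min trends (deg F per decade), Jan through Dec, from the list above
clt_min = [0.0, 0.1, 0.0, 0.2, 0.3, 0.0, -0.3, -0.1, 0.1, -0.4, -1.1, 0.5]

print(round(sum(clt_min) / 12, 2))  # -0.06, which NOAA shows rounded to -0.1

# Same calculation with November set flat (0.0) instead of -1.1
no_nov = clt_min[:10] + [0.0, clt_min[11]]
print(round(sum(no_nov) / 12, 2))   # 0.03, close to the +0.04 figure above
```

So that single month really does flip the sign of the annual number.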
Next, I looked at monthly 1980-2017 changes per decade for Charlotte maxes to see how they compare to the mins:
J: +1.0
F: +0.7
M: +0.6
A: +0.6
M: +0.5
J: +0.6
J: +0.2
A: +0.6
S: +0.6
O: +0.7
N: +0.4
D: +1.1
Annual: +0.6
Note that all 12 months' maxes warmed during 1980-2017, and each month's max trend was more positive than its min trend.
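A quick check of that claim from the two lists above (same Jan-Dec ordering):

```python
# Charlotte 1980-2017 trends (deg F per decade), Jan through Dec
clt_min = [0.0, 0.1, 0.0, 0.2, 0.3, 0.0, -0.3, -0.1, 0.1, -0.4, -1.1, 0.5]
clt_max = [1.0, 0.7, 0.6, 0.6, 0.5, 0.6, 0.2, 0.6, 0.6, 0.7, 0.4, 1.1]

# Max-minus-min gap for each month
gaps = [mx - mn for mx, mn in zip(clt_max, clt_min)]
print(all(g > 0 for g in gaps))  # True: maxes outpaced mins in every month
print(round(max(gaps), 1))       # 1.5, the November gap, by far the largest
```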
I then looked at monthly 1980-2017 min trends/decade for Raleigh to see how they compare to the Charlotte mins:
J: +0.9
F: +0.7
M: +0.7
A: +1.1
M: +1.3
J: +0.8
J: +0.4
A: +0.8
S: +0.9
O: +0.8
N: -0.2
D: +1.1
Annual: +0.8
Note that November was the only month with a colder min trend for Raleigh, too. Also, the 3 most positive months (April, May, and Dec) were also the most positive for Charlotte mins, which is intuitive.
There is a pretty good correlation between the Charlotte and Raleigh min trends month by month, but with Raleigh's trend always 0.6 to 1.2 more positive than Charlotte's. The result is that Charlotte comes out at -0.1 vs. Raleigh's +0.8 for annual mins.
Raleigh's annual for maxes was +0.7. This is very similar to its +0.8 annual for mins, which is intuitive. This similarity for maxes and mins is a striking difference vs Charlotte, which was -0.1 for mins and +0.6 for maxes (i.e., mins and maxes very different). Also, note that the +0.6 for Charlotte maxes is similar to both the +0.7 (maxes) and +0.8 (mins) for Raleigh.
So, the nearly flat -0.1 for Charlotte mins really sticks out. If it were due to the UHI effect no longer contributing to warming, why did Charlotte maxes still warm about as much as Raleigh maxes? With GW alone (UHI excluded), mins supposedly warm at least as fast as maxes, if not faster. So the mins should have still warmed up just like the maxes did.
So, here's my question: did Charlotte's station move during 1980-2017 to a location with better radiational cooling while Raleigh's didn't? If so, that could explain why Charlotte mins were ~flat (cooling from better radiational cooling cancelling out the continued GW warming of mins), and why Charlotte maxes, as well as both Raleigh mins and maxes, continued to warm (due to GW).