I still have a hard time buying the argument that "the weather" could cause footballs to deflate as much as they allegedly did.
I'm going to try to avoid fratching here and just lay out some facts, as best I can.
OK, first, the Ideal Gas Law is defined as:
PV = nRT
where P is pressure, V is volume, n is the amount of gas (in moles), R is the gas constant, and T is the absolute temperature.
1 PSI = 6.89475729 kPa
Gas constant R = 8.3144621 J/(mol·K)
Football volume: 4.237 liters
Standard temperature: 273.15 K (0ºC, 32ºF)
Standard pressure: 99.973980705 kPa (14.5 PSI)
First, using the standard temperature above, we need to figure out how much air (n) is in the ball.
Rearranging the formula to solve for n:
n = PV/RT
So in this case, let's assume a gauge pressure of 13 PSI (89.63184477 kPa), and for this exercise treat that gauge reading as the pressure term in the equation. Assuming standard temperature and using the volume of the football and the constant R, we get:
n = ( 89.63184477 * 4.237)/(8.3144621 * 273.15)
So the value for n comes out to approximately 0.16721893 mol.
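As a sanity check, here's a minimal Python sketch of that mole calculation, using the same numbers and (like the rest of this post) plugging the 13 PSI gauge reading straight into PV = nRT:

# Constants as listed above (gauge reading treated as the pressure term)
PSI_TO_KPA = 6.89475729      # kPa per PSI
R = 8.3144621                # gas constant, L*kPa/(mol*K) (same as J/(mol*K))
V = 4.237                    # football volume, liters
T_STD = 273.15               # standard temperature, K

P = 13 * PSI_TO_KPA          # 13 PSI gauge -> 89.63184477 kPa

# n = PV / RT
n = (P * V) / (R * T_STD)
print(round(n, 8))           # ~0.16721893 mol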
To make sure everything equals out, we plug back into the standard equation:
PV = nRT
(89.63184477 * 4.237) = (0.16721893 * 8.3144621 * 273.15)
The values are close. We end up with:
379.77012629049 = 379.77012977568510195
So if we round to 5 decimals, we get 379.77013 = 379.77013.
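The round-trip check in the same sketch looks like this:

PSI_TO_KPA = 6.89475729
R = 8.3144621                # L*kPa/(mol*K)
V = 4.237                    # liters
T_STD = 273.15               # K
P = 13 * PSI_TO_KPA          # 89.63184477 kPa
n = 0.16721893               # mol, rounded value from the step above

lhs = P * V                  # 379.77012629049
rhs = n * R * T_STD          # 379.770129775...
print(round(lhs, 5) == round(rhs, 5))   # True, both ~379.77013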
So, that day in Foxborough, there was a game-time temperature of 51ºF (283.706 K). For the purposes of this exercise, we will assume that we are at standard atmospheric pressure.
Using this information, we can now recalculate the pressure at the game-time temperature:
P = (0.16721893 * 8.3144621 * 283.706)/4.237
P = 93.09570707 kPa (or 13.50239077 PSI)
So between the two gauge pressures there is a difference of 3.46386148 kPa, an increase of a little more than ½ PSI (about 0.50 PSI) for a 19ºF (10.556 K) rise in temperature.
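In the same sketch, re-solving for P at the 51ºF game-time temperature looks like this:

PSI_TO_KPA = 6.89475729
R = 8.3144621                       # L*kPa/(mol*K)
V = 4.237                           # liters
T_STD = 273.15                      # K
P_START = 13 * PSI_TO_KPA           # 13 PSI gauge, treated as the pressure term

n = (P_START * V) / (R * T_STD)     # ~0.16721893 mol, as before

T_GAME = 283.706                    # 51 F in kelvin
P_game = (n * R * T_GAME) / V       # P = nRT / V
print(P_game)                       # ~93.0957 kPa
print(P_game / PSI_TO_KPA)          # ~13.5024 PSI
print(P_game - P_START)             # ~3.4639 kPa, i.e. roughly +0.50 PSI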
Now, if we assume the balls were inflated in a room that is 72ºF (295.372 K), we get:
P = (0.16721893 * 8.3144621 * 295.372)/4.237
P = 96.92380559 kPa (14.05760952 PSI)
So assuming a 21ºF (11.666 K) drop in temperature from the room where the balls were inflated to the outdoor ambient temperature, there's a pressure difference of 0.55521875 PSI.
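And the 72ºF locker-room version of the sketch, including the resulting drop to the 51ºF figure:

PSI_TO_KPA = 6.89475729
R = 8.3144621                # L*kPa/(mol*K)
V = 4.237                    # liters
n = 0.16721893               # mol, rounded value from the first step

T_ROOM = 295.372             # 72 F in kelvin
T_GAME = 283.706             # 51 F in kelvin

P_room = (n * R * T_ROOM) / V        # ~96.9238 kPa, ~14.0576 PSI
P_game = (n * R * T_GAME) / V        # ~93.0957 kPa, ~13.5024 PSI

drop_psi = (P_room - P_game) / PSI_TO_KPA
print(drop_psi)                      # ~0.5552 PSI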
Now that I've bored you completely to tears... it would appear that, assuming the balls were inflated at a 72ºF room temperature and the game-time temperature was 51ºF, the most they could have "deflated" by game time is roughly 0.6 PSI.
According to this site:
http://www.wcsh6.com/story/weather/2...roll/22065861/
The barometric pressure that day was 1009.5 mb.
So plugging that in, and starting at the "low end" of 12.5 PSI, I got a "gauge pressure" reading of 12.08 PSI.
A drop of 0.42 PSI.
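For reference, here's a quick conversion of that barometric reading into the units used above (1 mb = 0.1 kPa); exactly how it gets folded into the gauge-pressure numbers is the step I'd most like checked:

PSI_TO_KPA = 6.89475729

baro_mb = 1009.5                  # reported barometric pressure, millibars
baro_kpa = baro_mb / 10.0         # 1 mb = 0.1 kPa -> 100.95 kPa
baro_psi = baro_kpa / PSI_TO_KPA

print(baro_kpa)                   # 100.95 kPa
print(baro_psi)                   # ~14.64 PSI, a bit above the 14.5 "standard" used earlier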
Thoughts, everyone?