Kind of a weird question, but let's say the outside temp is 80f. Assume your thermostat activates when the inside temperature goes 1f above the current setting. Also assume your AC stops running when it hits the desired temperature. I know these all work slightly differently, but let's just assume this for the sake of this argument.
Scenario A
You come in the house and it's 80f. You set your thermostat to 75f and it runs for 15 minutes until it hits 75f. Then 10 minutes later it goes up to 76f, so it runs again (say 5 minutes) until it goes back to 75f, at which point it turns off.
Scenario B
You come in the house and it's 80f. You set your thermostat to 70f and it runs for 30 minutes until it hits 70f. Then 10 minutes later it goes up to 71f, so it runs again (say 5 minutes) until it goes back to 70f, at which point it turns off.
In scenario B, are you really only paying more for the initial 30-minute run vs. the 15-minute run? Or does having the thermostat set lower mean the temperature will rise 1f at a much faster rate than in scenario A, causing the AC to trigger many more times throughout the day?
Anyone know a formula to determine how quickly it rises based on the current indoor temperature vs. the outside temperature? This would be an interesting study, but I don't want to waste my $ to perform it...
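The kind of model I'm imagining is something like Newton's law of heating, where the house warms up at a rate proportional to the gap between outside and inside temperature. Here's a rough sketch of that (the constant `k` is a totally made-up stand-in for how leaky/insulated the house is, not a measured value):

```python
# Rough sketch assuming Newton's law of heating with the AC off:
#   dT/dt = k * (T_outside - T_inside)
# k is an assumed placeholder for the house's heat-gain rate.
import math

def minutes_to_rise_one_degree(t_inside, t_outside=80.0, k=0.01):
    """Minutes for the indoor temp to drift from t_inside to t_inside + 1
    with the AC off, under dT/dt = k * (t_outside - t_inside)."""
    # Integrating gives t = (1/k) * ln((T_out - T0) / (T_out - (T0 + 1)))
    return (1.0 / k) * math.log((t_outside - t_inside) / (t_outside - (t_inside + 1)))

print(minutes_to_rise_one_degree(75))  # Scenario A: drift from 75f to 76f
print(minutes_to_rise_one_degree(70))  # Scenario B: drift from 70f to 71f
```

If a model like this is roughly right, the 70f house drifts up about twice as fast as the 75f house, since the indoor/outdoor gap is twice as large, so scenario B would cycle more often, not just cost more on the initial pull-down. But that's exactly the assumption I'd like someone who knows HVAC to confirm or correct.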