Not just in this case, but in general, you want to give any particular variable the minimum scope necessary to do its job.1
There are, however, a few rare cases where it can make sense to give a variable a larger scope than strictly necessary. The typical reason (excuse?) is performance: if the variable is of a type that's extremely expensive to create and/or destroy, but very cheap to reuse, and performance of that part of the code is a real bottleneck, it can sometimes be worth moving it out to a larger scope. That way it can (for example) be created once before a loop starts and destroyed once after the loop ends, instead of being created and destroyed on every iteration.
In such a case, however, it's typically better to have a wrapper that acts a little like a singleton: it creates a single instance of the underlying (expensive) object when first needed and reuses it for as long as it's needed. Then, when you're completely done with it, that underlying instance is destroyed. I've only run into this a few times, though, and the circumstances have varied enough that I can't offer much more than hand-waving generalities about the general case.
Other than that, I'd generally prefer to avoid `while (true)` as well, except in the rather rare case that you really want a loop that never exits. Otherwise, you usually end up with an `if (x) break;` (or something similar) somewhere inside the loop. In most cases you can move that exit criterion into the loop header where it belongs, so somebody reading the code can see the exit condition without having to scan through the entire body looking for it.
With some care, this also gives some guidance about the intent of the loop, so when somebody is looking for a specific part of the code (i.e., most typical code reading) they know they can skip over the entire loop if it's unrelated to what they care about right now. With a `while (true)`, they're pretty much stuck reading through the body of the loop to divine its intent, typically wasting quite a bit of time on irrelevant code.
1. Yes, there are a few languages that make this choice problematic. As Lincoln points out in his answer, JavaScript is one of them. The answer, IMO, is not to write crappy code to suit those languages. The answer is to expunge such horrible languages from the face of the earth, and get on with life using languages that aren't such a mess.