Quote:
Originally written by KernelKnowledge12:
Quote:
Originally written by silver harloe: but a "flag" is a bit. 0 or 1.
Only in computer engineering. In computer programming, a "flag" is any variable that holds a boolean value (although most programmers stretch the meaning beyond that). The variable may be a single bit, or even a 64-bit integer (though using a whole 64-bit integer for one flag would be wasteful).
I'm aware you can waste space storing flags. I meant in terms of range of values, not in terms of how it is stored. Using a long long to store flags might not be such a bad thing if you have, say, 64 of them, but for compiler/cross-platform compatibility I'd stick with 32 flags per int.
Too bad BoA Editor is misusing the term, though. More and more bad knowledge being spread around.