Originally Posted by RagnarokCzD
Originally Posted by etonbears
so everyone's opinion here is just speculation.
Obviously ...
But even speculation should be based on something ... I mean, if you think something, you must have had some reason for it, no?

If you want to claim that something would require additional testing, coding, UI changes, or creating additional choices ... you should have no problem explaining why you believe that.

You don't "just think it" ... that's madness. O_o

Equally, if you have no visibility into how something is created, you cannot be sure that ANY change would have no impact, however "simple" you consider the change to be.

There are good general principles usually followed when designing software (avoid hard-coded constants, use modular design with unit testing, always perform integration and regression testing for any change, and many more). But for any given development, there may be reasons to consciously ignore those principles.
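As a throwaway illustration of the first of those principles, here is a hypothetical C++ snippet (invented for this post, not taken from any real game code): the same fall-damage formula written with buried magic numbers, and then with named constants kept in one place so a design tweak touches a single definition and is easy to test in isolation.

```cpp
#include <iostream>

// Hypothetical example only: a "magic number" version.
// What do 4.0 and 2.5 mean? Changing the rule means hunting
// down every copy of these literals across the codebase.
double fallDamageHardcoded(double heightMetres) {
    return (heightMetres - 4.0) * 2.5;
}

// The same rule with named constants defined once.
constexpr double kSafeFallHeight = 4.0;  // metres a character can fall unharmed
constexpr double kDamagePerMetre = 2.5;  // damage dealt per metre beyond that

double fallDamage(double heightMetres) {
    return (heightMetres - kSafeFallHeight) * kDamagePerMetre;
}

int main() {
    std::cout << fallDamage(10.0) << '\n';  // prints 15
}
```

Trivial, obviously, but it shows why "just change one number" is rarely as free as it sounds when the number was never isolated in the first place.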

Videogames are particularly prone to unusual development decisions, partly because they are real-time software, which adds performance constraints to design choices, and partly because games are ephemeral and do not have a long support tail, so there is little need to plan for future adaptability.

If I were to pick a single reason why any arbitrary change to a game has an associated cost, it would be that every change SHOULD be adequately tested before being released. Judging by the bugginess of many games, this often does not happen :)