If, as a video game producer, you could suddenly and dramatically expand your audience to 10x what you currently have…
…and all you had to do was optimize your own code to work on older machines…
Would you do it?
I spent 30 years in programming.
The first game I bought was Temple of Apshai for my Atari 600XL.
I found out the hard way that it required 64K of RAM, while my dinky little machine had only 16K. That sent me researching via FTP, ARPANET, and bulletin board services, where I learned how to disassemble running machine code. Once I could do that, all I had to do was find the game's check for the required amount of memory and patch in a "Jump to Subroutine" (JSR) that hopped over it, and boom, the game ran.
I found out the game played quite well for about an hour.
But after some time it would begin to slow down.
Then it would altogether crash.
So I learned to save early and save often.
As I explored what was going on with the game, I learned the memory requirement wasn't real: technically, the game needed nothing more than 16K of RAM.
And when it crashed, it crashed because the game managed that memory quite poorly.
Put specifically: the developers shipped a poorly coded product and inflated the system requirements to compensate for sloppy resource management and a lack of comprehensive testing of the end product.
So what would happen is this: in this Dungeons & Dragons-style game, I would enter an area and memory would be allocated for that area's resources. It wasn't a lot, say 1K, but when I moved to another area, the programmers never deallocated it. Wash, rinse, repeat for 10 areas and the game slowed down as the system hunted for scarce memory. By 16 areas it was slower than a crawl and throwing errors because no memory was left. And as I entered the 17th area, the game crashed hard and froze the system, requiring a total reboot.
In a nutshell, the programmers could have resolved the issue through object-oriented design, or with procedural routines called on entrance to and exit from every area, freeing each area's memory when the player left it.
Had the game gone through quality assurance and testing, this would have been caught. Instead, the developers, thinking their code was perfect, may have run it on a lower-memory system and concluded that the code wasn't bad, there simply wasn't enough memory to run the game.
Modern simulation/game/alternate-reality/virtual-reality engines such as CryEngine 3 and Unreal Engine 4, even Source's SDK, make the same mistake: leveraging legacy code that's not fully understood, while developers build on high-end machines for high-end machines. I know. I made this mistake myself at one time.
But one of the most important lessons of working in a corporate environment is understanding that the vast majority of users run operating systems and hardware which are, according to the providers of that software, deprecated and/or nearing obsolescence.
Now this introduces an interesting paradox for game sales, as game income is artificially constrained by the development mentality.
Why is this? Pretty simple. Most home users own PCs averaging 2 to 4 GB of RAM, 5400 RPM drives, integrated Intel graphics, and dual-core CPUs around 1.2 GHz.
Rewind to 2003, when this was an optimal system spec, when OpenGL graphics and custom caching mechanisms built into the core of the engines were the norm.
Magnificent high-resolution games came out and flipped the industry on its head. Games such as Grand Theft Auto: Vice City. Games such as World of Warcraft. And way back in 1999, EverQuest.
So now, like Microsoft's operating systems after Windows XP, features keep being added to these engines, but the lessons learned about memory management and caching have not moved forward with them.
But games are different.
As I see wonderful works of art like Star Citizen.
Like No Man's Sky.
Like Assassin’s Creed 4+.
Or try to turn the graphics up in Star Trek Online (a personal favorite, despite how utterly profane the game is).
I pick up these games. I look at the system specs.
I look at YouTube and marvel at graphics I will never see as a homeless man sleeping in a tent with only a 4 GB machine running a 1.3 GHz AMD processor.
I can't help but feel like I am missing out.
I would like to say “It’s a good thing I have 10 thousand other games to choose from”
But I can't. So I pull up my 1999 video game, EverQuest.
I turn the resolution up on it.
And pretend it looks as good as the new games do.
I mean. I know it’s possible to make these games look fantabulous on these older machines.
If only today's developers simply took a step back and maybe hired a few of us old-fart programmers living on the street who love playing the things they create.
That, or if they tested their own games and started working through the memory management and caching issues I struggled with when I first learned to program, the ones every developer nowadays seems to take for granted.
If you could suddenly expand your audience to 10x what you currently have.
And increase revenues accordingly.
Wouldn't the QA, testing, and development revisions pay for themselves?
Seems like common sense to me…
Refactor, refactor, refactor