The 1987 game “The Last Ninja” was 40 kilobytes (twitter.com)

251 points by keepamovin 17 hours ago

YZF 15 hours ago

I was looking at a production service we run that was using a few GBs of memory. When I add up all the actual data needed in a naive compact representation I end up with a few MBs. So much waste. That's before thinking of clever ways to compress, or de-duplicate or rearrange that data.

Back in the day getting the 16KB expansion pack for my 1KB RAM ZX81 was a big deal. And I also wrote code for PIC microcontrollers that have 768 bytes of program memory [and 25 bytes of RAM]. It's just so easy not to think about efficiency today: you write one line of code in a high-level language and you blow away more bytes than these platforms had, without doing anything useful.
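The gap between a "convenient" in-memory representation and a naive compact one is easy to demonstrate. A rough Python sketch (the record layout here is made up for illustration):

```python
import struct
import sys

# Hypothetical record: a 32-bit id and a 64-bit float reading.
records = [(i, i * 0.5) for i in range(100_000)]

# "Convenient" representation: one dict per record.
as_dicts = [{"id": i, "value": v} for i, v in records]

# Naive compact representation: fixed-size binary records (4 + 8 bytes each).
packed = b"".join(struct.pack("<Id", i, v) for i, v in records)

print(len(packed))                 # 1,200,000 bytes: 12 bytes per record
print(sys.getsizeof(as_dicts[0]))  # a single dict alone is far bigger than 12 bytes
```

The packed form ignores Python's per-object overhead entirely, which is roughly the "naive compact representation" a back-of-the-envelope estimate gives you.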

jnpnj 10 hours ago

Long ago, working for a retail store chain, I made an Excel DSL to encode business rules to update inventory spreadsheets. While coding I realized that their Excel template had a bunch of cells containing only whitespace down on row 100000. This forced Excel to store the sparse matrix for the 0:100000 region, adding hundreds of KB for no reason, multiplied by thousands of these files across their internal network. Out of curiosity I added empty-cell cleaning to my DSL, and I think I managed to fit the entire company's Excel file set on a small SD card (circa 2010).
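The effect can be sketched without Excel itself. A toy Python model of a sparse sheet (the cell layout and the "used range" behavior are simplified assumptions, not how Excel's file format actually works):

```python
# Hypothetical sheet: real data in rows 0-99, plus a stray whitespace cell at row 100000.
cells = {(r, 0): f"item-{r}" for r in range(100)}
cells[(100_000, 3)] = "   "   # the junk that inflates the "used range"

def used_range(cells):
    """Bounding box a naive writer would have to serialize."""
    max_row = max(r for r, _ in cells)
    max_col = max(c for _, c in cells)
    return max_row + 1, max_col + 1

def strip_blank_cells(cells):
    """Drop cells whose value is empty or whitespace-only."""
    return {k: v for k, v in cells.items() if str(v).strip()}

print(used_range(cells))                   # (100001, 4): one junk cell, huge range
print(used_range(strip_blank_cells(cells)))  # (100, 1): back to the real data
```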

eleveriven 5 hours ago

I think you're right about the waste, but I'm not sure it's entirely "accidental"... a lot of it is traded for different kinds of efficiency

HeyLaughingBoy 2 hours ago

It usually is. I try to think of these things not as "waste" but as "cost." As in, what does it cost vs. the alternative? You're using 40GB of some kind of storage. Let's say it's reasonably possible to reduce that to 20GB. What's the cost of doing so compared to the status quo? That memory reduction effort, both the initial effort and the ongoing maintenance, isn't free. Unless it costs a lot less to do that than to continue using more memory, we should probably continue to use the memory.

Yeah, there may be other benefits, but as a first order of approximation, that works. And you'll usually find that it's cheaper to just use more memory.
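That break-even framing can be written down directly. A sketch with entirely made-up numbers (the prices, effort costs, and savings below are all hypothetical):

```python
# Back-of-the-envelope break-even: is it worth spending engineering
# time to halve memory usage? All figures are invented for illustration.
saved_gb = 20                    # hypothetical: 40 GB -> 20 GB
cost_per_gb_month = 5.0          # hypothetical RAM price, $/GB/month
monthly_saving = saved_gb * cost_per_gb_month

engineering_cost = 20_000.0      # hypothetical initial optimization effort, $
maintenance_per_month = 500.0    # hypothetical ongoing upkeep, $

net_saving_per_month = monthly_saving - maintenance_per_month
# Months until the optimization pays for itself (if it ever does).
months_to_break_even = (engineering_cost / net_saving_per_month
                        if net_saving_per_month > 0 else float("inf"))
print(months_to_break_even)
```

With these particular made-up numbers the maintenance alone outweighs the saving, so the break-even is never reached, which is exactly the "just use more memory" conclusion.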

hnlmorg 11 hours ago

Sure, if you don’t count safety features like memory management, crash handling, automatic bounds checks and encryption ciphers as anything useful.

I do completely agree that there is a lot of waste in modern software. But equally there is also a lot more that has to be included in modern software that wasn’t ever a concern in the 80s.

Networking stacks, safety checks, encryption stacks, etc all contribute massively to software “bloat”.

You can see how this quickly adds up if you write a “hello world” CLI in assembly and compare that to the equivalent in any modern language that imports all these features into its runtime.

And this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater than anything supported by 80s micro computers.

bayindirh 11 hours ago

Yes, but this doesn't prevent you from being mindful and selecting the right tools with smaller memory footprint while providing the features you need.

Go's "GC disadvantage" is turned on its head by developing "Zero Allocation" libraries which run blazingly fast with fixed memory footprints. Similarly, rolling your own high performance/efficient code where it matters can save tremendous amounts of memory where it matters.

Of course more features and safety nets will consume memory, but we don't have to waste it like there are no other things running on the system, no?

> And this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater than anything supported by 80s micro computers.

This demo [0] is a 4kB executable. 4096 bytes. A single file. All assets included (graphics, music and whatnot), and it can run at high resolutions with real-time rendering.

This is [1] 64kB and this [2] is 177kB. This game from the same group is 96kB with full 3D graphics [3].

[0]: https://www.pouet.net/prod.php?which=52938

[1]: https://www.pouet.net/prod.php?which=1221

[2]: https://www.pouet.net/prod.php?which=30244

[3]: https://en.wikipedia.org/wiki/.kkrieger

pjc50 10 hours ago

I would also add internationalization. There were multi-language games back in the day, but the overhead of producing different versions for different markets was extremely high. Unicode has .. not quite trivialized this, but certainly made a lot of things possible that weren't.

Much respect to people who've managed to retrofit it: there are guerrilla-translated versions of some Japanese-only games.

> this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater

Yes, people underestimate how much this contributes, especially to runtime memory usage.

eviks 10 hours ago

> all contribute massively to software “bloat”.

Could you point to an example where those gigs were really "massively" due to crash handling, bounds checks, etc.?

cyberpunk 10 hours ago

I implemented a system recently that is a drop-in replacement for a component of ours; the old one used 250GB of memory, the new one uses 6GB, exactly the same from the outside.

Bad code is bad code, poor choices are poor choices — but I think it's often pretty fair to judge things harshly on resource usage.

anthk 11 hours ago

Back in the day people had BASIC and some machines had Forth and it was like

        print "Hello world" 
or

        ." Hello world " / .( Hello world )
for Forth.

By comparison, given how they optimized games for 8- and 16-bit machines, I should be able to compile Cataclysm DDA:BN on my potato netbook, and yet it needs GIGABYTES of RAM to compile. It's crazy that you need damn swap for something that required far less RAM 15 years ago for the same features.

If the game were reimplemented in Golang it wouldn't feel many times slower. But no, we are suffering the worst of both sides of the coin: a language that should have been replaced by Inferno (from the Plan 9 people, the C and Unix creators, and now by Golang, its cousin), with horrible compile times, horrible and incompatible ABIs, featuritis, crazy template syntax and, if you are lucky, memory safety.

Meanwhile, I wish the forked Inferno/Purgatorio got a seamless (no virtual desktops) mode so you could fire up the application in a VM integrated with the guest window manager (à la Java) and that's it. Limbo+Tk+SQLite would have been incredible for CRUD/RAD software once the GUI was polished up a little, with sticky menus as in Tcl/Tk and the like. In the end, if you know Golang you could learn Limbo's syntax (same channels, too) with ease.

shiroiuma 11 hours ago

>Sure, if you don’t count safety features like memory management, crash handling, automatic bounds checks and encryption cyphers; as anything useful.

>Networking stacks, safety checks, encryption stacks, etc all contribute massively to software “bloat”.

They had most of this stuff in the 1980s, and even earlier really. Not on the little 8-bit microcomputer that cost $299 that you might have had as a kid, but it certainly did exist on large time-sharing systems used in universities and industry and government. And those systems had only a tiny fraction of the memory that a typical x86-64 laptop has now.

rigonkulous 10 hours ago

The BASIC 10Liner competition wants you to know that there is a growing movement of hackers who recognize the bloat and see, with crystal clarity, where things kind of went wrong ...

https://basic10liner.com/

".. and time and again it leads to amazingly elegant, clever, and sometimes delightfully crazy solutions. Over the past 14 editions, more than 1,000 BASIC 10Liners have been created — each one a small experiment, a puzzle, or a piece of digital creativity .."

yurishimo 6 hours ago

That website seems to be gone now, unless it’s supposed to redirect to a sketchy German wix ad…

mysterydip 7 hours ago

There was one time I was troubleshooting why an app used at a company would crash after some amount of time passed. Investigating the crash dumps showed it using 4GB of RAM before it died: suspiciously, exactly the address limit of a 32-bit application.

Turned out they never closed the files it worked on, so over time it just consumed RAM until there wasn't any more for it to access.

vinkelhake 14 hours ago

I grew up with and absolutely adore The Last Ninja series. I'm not going to comment on the size thing because it's so trite.

Instead - here's [0] Ben Daglish (on flute) performing "Wastelands" together with the Norwegian C64/Amiga tribute band FastLoaders. He unfortunately passed away in 2018, just 52 years old.

If that tickled your fancy, here's [1] a full concert with them where they perform all songs from The Last Ninja.

[0] https://www.youtube.com/watch?v=ovFgdcapUYI [1] https://www.youtube.com/watch?v=PTZ1O1LJg-k

dwd 7 hours ago

Here's Reyn Ouwehand, who composed The Last Ninja 3, performing with the FastLoaders:

https://www.youtube.com/watch?v=0bobBcV4HcY

He also has a few nostalgia triggering covers of some Galway tracks.

https://www.youtube.com/watch?v=n7niD6i4020

https://www.youtube.com/watch?v=PTSUR3RHh9M

uxcolumbo 10 hours ago

R.I.P Ben. He was such a positive human being, always encouraging you to do great things, even if you doubted yourself.

Here is a little clip of him from Bedroom to Billions: https://www.youtube.com/watch?v=aRsLOUYL3mk

kbenson 14 hours ago

The first time I ever heard The Glitch Mob, I had such a clear memory of this game's soundtrack come to mind that I mentioned it to my brother soon after (it was his Commodore, and his copy of the game, that I was playing when I was young). I'm not even sure the song I heard sounds particularly close to the game's soundtrack, but the connection in my mind was very strong.

antisol 13 hours ago

I know exactly how you feel - The Way Out Is In (https://youtu.be/kqFqG-h3Vgk) heavily evokes video games for me

emil-lp 9 hours ago

Here's more from FastLoaders:

https://c64audio.com/pages/fastloaders

dspillett 9 hours ago

> isometric on the C64 with such an amazing level of detail - simply gorgeous

Or a convincing representation of that. A lot of old tricks mean that the games are doing less than you think they are, and are better understood when you stop asking "how do they do that" and start asking "how are they convincing my brain that is what they are doing".

Look at how little RAM the original Elite ran in on a BBC Model B, with some swapping of code on disk⁰. 32KB, less the 7.75KB taken by the game's custom screen mode² and a little more reserved for other things¹. I saw breathless reviews at the time, and have seen similar nostalgic reviews more recently, talking about "8 whole galaxies!" when the game could easily have had far more than that and at one point was going to. They cut it down not for technical reasons but because having more didn't feel usefully more fun and might actually put people off. The galaxies were created by a clever little procedural generator, so adding more would have only added a couple of bytes each (to hold the seed and maybe other params for the generator).

Another great example of not quite doing what it looks like the game is doing is the apparently live-drawn 3D view in the game Sentinel on a number of 8-bit platforms.

--------

[0] There were two blocks of code that were swapped in as you entered or left a space station: one for while docked and one for while in-flight. Also the ship blueprints were not all in memory at the same time, and a different set was loaded as you jumped from one system to another.

[1] The CPU call stack (technically up to a quarter K, though the game code only needed less than half of that), scratch space on page zero mostly used for game variables but some of which was used by things like the disk controller ROM and sound generator, etc.

[2] Normal screen modes close to that consumed 10KB. Screen memory consumption on the BBC Master Enhanced version was doubled as it was tweaked to use double the bit depths (4bpp for the control panel and 2bpp for the exterior, instead of 2bpp and 1bpp respectively).
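The seed-based galaxy idea is easy to sketch: store a seed, regenerate the world on demand. This is illustrative only, not Elite's actual generator (the fields and value ranges below are made up):

```python
import random

def generate_galaxy(seed, n_systems=8):
    """Deterministically expand a seed into a 'galaxy' of star systems.
    Illustrative only; not Elite's real algorithm."""
    rng = random.Random(seed)
    return [
        {
            "x": rng.randrange(256),        # hypothetical map coordinate
            "y": rng.randrange(256),
            "economy": rng.randrange(8),    # hypothetical economy type
        }
        for _ in range(n_systems)
    ]

# Only the seed needs storing; the same seed always regenerates the same systems.
print(generate_galaxy(42) == generate_galaxy(42))   # True
```

This is why "adding more galaxies" costs only a few bytes: each extra galaxy is just one more seed, and the data is recomputed rather than stored.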

le-mark 15 hours ago

Apparently this person is referring to the available RAM on a Commodore 64. The media (data) on disk or tape was much more than that.

classichasclass 15 hours ago

Not much more. It all fits on a single side of a 1541 floppy. Even considering compression it couldn't be more than a couple hundred kilobytes.

https://csdb.dk/release/?id=99145

boomlinde 14 hours ago

It's not much, but relatively speaking it's much more.

dspillett 9 hours ago

I'd say up to a couple of hundred is much more than 40. Not a full decimal order of magnitude, but even without compression the 170KB on one side is up to 4½×.

cubefox 3 hours ago

> Not much more. It all fits on a single side of a 1541 floppy.

It could still be much more depending on how much data fits on a single side of a 1541 floppy.

chorlton2080 12 hours ago

You can access nearly 64KB of RAM on the C64, if you don't need the BASIC or Kernal (sic) ROMs. They can be software-toggled in or out. Agreed that even the tape had more game data than that, but not much more.

pjc50 10 hours ago

However, very few tapeloader games ever tried to load more assets from tape. Generally it would just load a memory image and that would be that for the entire game.

eleveriven 5 hours ago

But that's also kind of what makes it impressive in a different way. Even if the game was larger on disk/tape, they still had to stream it in tiny chunks and make it run within those constraints

DrBazza 10 hours ago

If we're talking about fitting a quart into a pint pot, it would be remiss not to mention Elite fitting into a BBC Model B, 32KB, and the excellent code archaeology of it, and variants, by Mark Moxon here: https://www.bbcelite.com/

rigonkulous 10 hours ago

A multi-level generative dungeon-crawler in 10 lines of code:

https://bunsen.itch.io/the-snake-temple-by-rax

We lost something in the bloat, folks. It's time to turn around and take another look at the past - or at least re-adjust the rearview mirror to actually look at the road and not one's makeup ..

nacozarina 8 hours ago

Gluecode-First Engineering: the free-love utopia of sharing code resulted in engineers abandoning whole-design and defaulting to just creating mash-ups of pre-existing code.

Nobody designs whole-apps anymore, it’s all about minimizing the gluecode written for the 1200 dependencies that make your app buzzword-compliant.

commandlinefan 3 hours ago

Amazing what you can accomplish when you have more than "a sprint" to deliver something and no project manager asking "are you done yet?"

sonzohan 2 hours ago

Recovering game dev here

The publisher for this game was Activision. They absolutely had deadlines, lots of (1987) money invested in this, outsourced to a third-party company in Hungary, had the outsource team fail, moved development platforms a few times, wrote a programming language and a game engine, and the game then became the best-selling C64 game.

Very much development hell.

YasuoTanaka 16 hours ago

It's kind of amazing how much of those old games was actual logic instead of data.

Feels like they were closer to programs, while modern games are closer to datasets.

hcs 12 hours ago

Chris Crawford called this "process intensity", he noted it at least back to 1983 with Dragon's Lair, discussed in this 1987 article https://www.erasmatazz.com/library/the-journal-of-computer/j...

makapuf 7 hours ago

Funny, because I rewrote a bad port of Dragon's Lair for a custom console: a tiny engine and a relatively huge dataset, each frame having one "if press X goto frame Y" instruction.
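That one-instruction-per-frame engine fits in a few lines. A toy sketch (the frame names and inputs are invented):

```python
# A branching-movie engine in miniature: all game logic is a table of
# "if the player presses X, go to frame Y" transitions.
FRAMES = {
    "intro": {"left": "dodge", "right": "fall"},
    "dodge": {"up": "win"},
    "fall": {},   # dead end: no transitions out
    "win": {},
}

def play(start, inputs):
    """Follow the transition table; a wrong button ends the run early."""
    frame = start
    for key in inputs:
        nxt = FRAMES[frame].get(key)
        if nxt is None:          # wrong button: in Dragon's Lair terms, you die
            return frame, False
        frame = nxt
    return frame, True

print(play("intro", ["left", "up"]))   # ('win', True)
```

The "engine" is trivial; all the bulk lives in the frame data, which is the process-intensity point in reverse.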

Steve16384 11 hours ago

Wasn't pretty much every 8-bit computer game of 1987 or earlier (before the 128KB machines became popular) < 40KB? The Spectrum and Commodore combined probably had a library in excess of 50,000 games.

forinti 6 hours ago

I love how you can put all the games ever made for a given 8 bit platform on a single flash drive.

socalgal2 12 hours ago

Most games back then were small. A C64 only had 64K and most games didn't use all of it. An Atari 800 had at most 48K. It wasn't until the 1200 that this went up. Both systems ran cartridge-based games, many of which were 8K.

Honestly though, I don't read much into the sizes. Sure they were small games and had lots of gameplay, for some definition of gameplay. I enjoyed them immensely. But it's hard to go back to just a few colors, low-res graphics, often no way to save, etc... for me at least, the modern affordances mean something. Of course I don't need every game to look like Horizon Zero Dawn. A Short Hike was great. It's also 400MB (according to Steam).

boptom 11 hours ago

The modern classic, Animal Well, is only 35mb in size!

https://store.steampowered.com/app/813230/ANIMAL_WELL/

a96 6 hours ago

Wow, you could fit tens of those in one bit!

(sorry)

the_af 7 hours ago

> Sure they were small games and had lots of gameplay, for some definition of gameplay. I enjoyed them immensely. But it's hard to go back to just a few colors, low-res graphics, often no way to save, etc... for me at least, the modern affordances mean something.

On one hand, you're of course right. It is hard to go back, except for the nostalgia.

On the other, do you know there is a scene of people still making brand new games for the Commodore 64 (and other home computers)? And selling them, too, these are not just free games. Of course the target audience is themselves, they make, sell and buy games within the community, but the point is it still exists.

Also there are artists making art in C64 graphics resolutions and color modes, and even PETSCII art enthusiasts (PETSCII is C64's text mode, which had some interesting symbols which facilitate creativity).

shiroiuma 11 hours ago

>But it's hard to go back to just a few colors, low-res graphics, often no way to save, etc... for me at least, the modern affordances mean something.

All those old games have a way to save now, if you run them in an emulator as is commonly done these days. That's how I played through Metroid and finally beat the mother brain in just a day or two during the pandemic.

Findecanor 9 hours ago

> “The Last Ninja” was 40 kilobytes

I have 1.1 GB of MP3s of just remixes of the music from the three games, some of which are from a Kickstarter by the composer of the second.

chmod775 15 hours ago

That short video of the game on twitter is 11.5MB, or about 300x larger than the game itself.

Dwedit 14 hours ago

x264 supports a lossless mode without chroma subsampling, which produces very good compression for raw emulator captures of retro game footage. It is much better than other codecs like Huffyuv, etc.

But for some reason, Firefox refuses to play back those kinds of files.

onion2k 14 hours ago

> But for some reason, Firefox refuses to play back those kinds of files.

And that reason is that x264 is a free and open source implementation of the H.264 codec, and you still need to pay for a license to use the patented technology regardless of how you do that. Using a free implementation of the codec doesn't get you a free license for the patents.

latch 15 hours ago

I'm not sure this is particularly telling. You can write a tiny program that generates a 4K image, and the image could be 1000x larger.

Or, if I write a short description "A couple walks hand-in-hand through a park at sunset. The wind rustles the orange leaves.", I don't think it would be surprising to anyone that an image or video of this would be relatively huge.
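A few lines really can describe megabytes of pixels. A minimal Python sketch producing a raw 3840x2160 RGB image (the gradient pattern is arbitrary):

```python
# A couple of lines of code describe ~24 MB of raw pixel data.
W, H = 3840, 2160
# One row of an arbitrary gradient, 3 bytes (R, G, B) per pixel.
row = bytes(b for x in range(W) for b in (x % 256, (x // 16) % 256, 128))
image = row * H   # every row identical here, yet still 24,883,200 bytes

print(len(image))
```

The asymmetry cuts both ways: a tiny generator implies huge output, which is exactly why demoscene productions can be 4KB while a video capture of them is megabytes.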

drob518 2 hours ago

We made the most of limited resources back then. Back in 1980, I was living large with my 64KB Apple II with dual 140KB floppy drives and a 10 inch (9 inch? I can’t quite remember) amber monochrome monitor. Most had less.

praptak 3 hours ago

I remember playing a version of this game on ZX Spectrum but I cannot find it on the internet. I remember it had bees that you had to avoid and a boat which you were able to untie so that it floats down a stream.

Anybody remember this one?

andai 8 hours ago

I shipped a browser game that was 8KB. Okay, plus 30 million lines of Chromium ;)

Most of my games are roughly in that range though. I think my MMO was 32KB, and it had a sound effects generator and speech synth in it. (Jsfxr and SAM)

I built it in a few days for a game jam.

I'm not trying to brag, I'm trying to say this stuff is easy if you actually care. Just look at JS13K. Every game there is 13KB or below, and there are some real masterpieces there. (My game was just squares, but I've seen games with whole custom animation systems in them.)

Once you learn how, it's pretty easy. But you'll never learn if you don't care.

You have to care because there's nothing forcing you. Arguably The Last Ninja would have been a lot more than 40KB if there weren't the hardware limitations of the time.

They weren't trying to make it 40KB, they were just trying to make a game.

In my case, I enjoy the challenge! (Also I like it when things load instantly :)

I think I'll make a PS1 game next. I was inspired by this guy who made a Minecraft clone for Playstation:

https://youtu.be/aXoI3CdlNQc?is=sDNnrGbQGJt_qnV6

P.S. most Flash games were only a few kilobytes, if you remove the music!

forinti 7 hours ago

I was comparing game prices last week and I found that prices from the 80s aren't too different from modern game prices.

Elite was £20 in 1984 and that would be £66 today, which is not very different from what a good game for the PS5 costs today.

Except that games then were made by one or two people and nowadays games are made by teams with coders, musicians, artists, etc.

matthewkayin 4 hours ago

Yeah, the games industry is in a pretty big crisis right now, and I think change needs to happen both ways:

Consumers need to understand that keeping games at the same price for decades despite rising costs and inflation is not realistic. If they want the industry to thrive, they need to be ok with games being more expensive.

Meanwhile, developers need to stop making games so expensive. This is an entertainment industry / corpo problem, really. Companies have seen the big profits and decided that only the big profits will do, which means you need to make a big open world cinematic experience, which is expensive, and because it's expensive, they won't take risks on making anything actually interesting.

The only way gaming moves forward is if we make riskier games that cost less to produce, which is why indies are the ones making the good games these days.

tralarpa 8 hours ago

A few years ago, I decompiled a good part of the PC version of Might & Magic 1 for fun. According to Wikipedia, it had been released in 1986, although I don't know whether that refers to the PC version or to the original Apple II version.

It is quite a big game: the main executable is 117KB, plus around 50 overlay files of 1.5KB each for the different dungeons and cities, plus the graphics files. I guess it was even too big for the average PC hardware at that time, or it was a limitation inherited from the original Apple II version: when you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM. Apart from that, the limited graphics and the lack of sound, the internal ruleset is very complete. You have all kinds of spells and objects, capabilities, an aging mechanism, shops, etc. The usual stuff that you also see in today's RPGs.

The modern uninstall.exe that came with it (I bought the game on GOG) was 1.3MB.

Pannoniae 7 hours ago

>When you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM

Probably not ;) "Enter things from a manual" was a tried-and-true copy protection technique. If you used the warez version you presumably did not have a manual, so you got stuck. This didn't run on the 8008 or whatever; I'm sure the game could have held the names of the spells fairly easily.

tralarpa 6 hours ago

Ah, that makes more sense than my theory. It's a weak copy protection method, though, as you can just try and see what happens, and I think they dropped it in M&M3.

tiku 6 hours ago

A lot of trial and error. I've built graphical tools with GD in PHP; the difficult part for me was that the coordinates were inverted. I only knew how to draw lines and pixels, but I got the job done.

pronik 8 hours ago

Around the time DirectX came around and the first games requiring it appeared, which in my memory coincided with hard drives getting way bigger and the first games being delivered on a CD instead of floppies, I was appalled to see literal BMPs being written to disk during installation. This was the same time when cracked games were being distributed via BBS at a fraction of the original size, with custom installers that decompressed MP3s back to the original WAV files. I asked the same questions then: why WAV, why BMP, why the bloat? With time I've learned the answer: disk space is cheap, memory and CPU cycles are not. If you can afford to save yourself the decoding step, you just do it; your players will love it. You work with the constraints you have, and when they loosen up, your possibilities expand too.
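The tradeoff can be sketched directly: a compressed asset is smaller on disk but costs a decode step every time it's loaded. A toy Python illustration with fabricated "asset" data:

```python
import time
import zlib

# Fabricated asset data, ~250 KB, deliberately compressible.
asset = (b"\x00" * 1000 + bytes(range(256))) * 200
compressed = zlib.compress(asset, level=9)

t0 = time.perf_counter()
for _ in range(50):
    zlib.decompress(compressed)   # the "load from compressed file" path
decode_time = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(50):
    bytes(asset)                  # the "load raw, pre-decompressed file" path
raw_time = time.perf_counter() - t0

print(len(compressed) < len(asset))   # smaller on disk...
print(raw_time < decode_time)         # ...but (usually) slower to load
```

Shipping compressed and decompressing once at install time buys both: small download, fast loads, at the cost of disk space, which was the cheap resource.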

xvxvx 16 hours ago

I remember this game, the way it drew itself on each screen, the nice graphics. Growing up with games on Atari, Commodore, Amstrad, and Spectrum was a lot of fun.

By comparison, COD Modern Warfare 3 is 6,000,000 times larger at 240GB. Imagine telling that to someone in 1987.

nine_k 13 hours ago

The Last Ninja ran at resolution 160x200, with effectively 2-bit color for graphic assets. It had amazing animations for that level of detail, but all the variety of the graphics could not take too much RAM even if it wanted to.

The quest for photorealistic "movie-like" rendering which requires colossal amounts of RAM and compute feels like a dead end to me. I much appreciate the expressly unrealistic graphics of titles like Monument Valley.

regularfry 9 hours ago

Hardware sprite accelerators, the first GPUs. I swear there's something visceral you learn by programming that sort of system where you can literally see what it's doing, in the order it's doing it, which you just can't get any other way.

guerrilla 8 hours ago

That's just incredible. People used to be so much better at programming, or at least great programmers had it easier to get funded. Most of what I see today is exceptionally low quality and just getting worse with time.

midzer 8 hours ago

My website https://midzer.de/ is themed like "The Last Ninja II", which was the first game I encountered when I was young.

bjt12345 6 hours ago

I never figured out how they did the turtle graphics in this game. The C64 didn't have whole screen bitmaps, you could either use sprites or user defined character sets, neither of which made this straightforward.

And the loading screens were also amazing, particularly for tape loading.

smcameron 5 hours ago

The TI99/4a version of the Logo language which has turtle graphics used user defined characters to implement them. There were only (I think) 128 user definable characters, and when the turtle graphics had redefined all of them to create its output, it gave the user a message, "out of ink".

vidarh 3 hours ago

As others have said, the C64 does have bitmap modes, though it's understandable not being aware of it as they weren't that commonly used for games since it was often easier to use user defined character sets as tilesets if you had repetition.

steve_taylor 5 hours ago

The C64 does have a couple of bitmap modes. The Last Ninja uses mode 3, which is multicolor bitmap mode. It occupies 9000 bytes including pixels (8000 bytes) and color RAM (1000 bytes).
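Those figures check out arithmetically; a quick sketch of the math, following the byte counts given above:

```python
# Memory math for the C64's multicolor bitmap mode described above.
width, height = 160, 200
bits_per_pixel = 2
bitmap_bytes = width * height * bits_per_pixel // 8   # pixel data

# One attribute byte per 8x8 cell of the 320x200 display: 40 x 25 cells.
cells = (320 // 8) * (200 // 8)

print(bitmap_bytes, cells, bitmap_bytes + cells)   # 8000 1000 9000
```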

EvanAnderson 6 hours ago

You might be thinking of another system (like the NES, perhaps), because the C64 has 160x100 and 320x200 bitmap modes.

hcs 4 hours ago

Even on NES a lot of games use CHR-RAM so arbitrary bitmaps are at least possible, though only a small part of the screen is unique without some rarely used mapper hardware. Zelda and Metroid mostly just use this to compress the graphics in ROM, Qix is a simple example with line drawing, Elite is an extreme one.

I made a demo of the Mystify screensaver using the typical 8KB CHR-RAM. Even with a lot of compromises it has pretty large borders to avoid running out of unique tiles. https://youtube.com/watch?v=1_MymcLeew8
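The tile budget behind those compromises is simple arithmetic (standard NES figures: 16 bytes per 8x8 tile at 2bpp, a 32x30-tile background):

```python
# How many unique tiles fit in 8 KB of CHR-RAM vs. how many the screen needs.
CHR_RAM = 8 * 1024
BYTES_PER_TILE = 16          # 8x8 pixels at 2 bits per pixel
unique_tiles = CHR_RAM // BYTES_PER_TILE

SCREEN_TILES = 32 * 30       # full NES background nametable

print(unique_tiles, SCREEN_TILES)   # 512 960: not every on-screen tile can be unique
```

With only 512 unique tiles against 960 background positions, something has to repeat, hence the large borders.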

tombert an hour ago

Not as small as The Last Ninja, but when I was a teenager first getting into emulation, I genuinely thought there was a mistake or my download got interrupted when I downloaded Super Mario Bros. 3, because it was only like 500kb [1], and I didn't think it was possible for a game that huge to be less than a megabyte.

It is still impressive to me how much game they could squeeze out of the NES ROM chips.

[1] Or something like that, I don't remember the exact number.

pronik 8 hours ago

Speaking of the size: my first PC, built by a family friend, had a 80MB disk, split into two partitions. The second 40MB partition had Windows 3.1 and about two Norton Commander columns full of games on it, largest of which were Wolfenstein 3D and Lost Vikings with about 1.4MB each. Truly a different era.

xg15 9 hours ago

The consequence of "space is cheap" / "If I didn't use that RAM, it would just sit there unused anyway" etc.

alex_john_m 9 hours ago

Some comments here sound like the ones I hear from car "enthusiasts" praising old engines for being simple to run and easy to fix, then complaining about modern engines being too complicated and how we should return to the "good old days", all that without taking into account the decades of progress since then.

Want to prove a point? Give me Skyrim in 64K of RAM. Go ahead! I dare you!

frereubu 8 hours ago

So you can read replies etc. without having to be logged in to X: https://xcancel.com/exQUIZitely/status/2040777977521398151

neurworlds 6 hours ago

40KB and it felt like a full world... I'm burning through tokens to get AI to decide whether to go to the tavern or the market. Something went wrong somewhere

nobodyandproud 5 hours ago

The music and atmosphere were gorgeous. Fond memories of wasted youth.

I never finished the game, sadly.

abrookewood 12 hours ago

God I loved that game. Don't think I ever managed to finish and now I'm tempted to try again!

aaa_aaa 12 hours ago

I played the game. Music was exceptional.

a96 6 hours ago

It really was. I was just wondering if Last Ninja 2 (Amiga) was the first game I actually liked playing. I mostly hated old games and I still don't like most games. Particularly ones with twitchy controls or platforming. LN wasn't that easy and it was very linear, but it was still somehow incredibly fun. And the music and even the graphics were great.

choam2426 9 hours ago

We live in an age of abundant memory — until you check RAM prices.

eleveriven 5 hours ago

It really puts into perspective how different the constraints were

oweiler 5 hours ago

That game felt like a graphics demo though. Almost unplayable.

anthk 11 hours ago

Some Pokémon Crystal ROMs pack a huge amount of gaming in very few MB. Z80-ish ASM, KB's of RAM.

The ZMachine games, ditto. A few kb's and an impressive simulated environment will run even under 8bit machines running a virtual machine. Of course z3 machine games will have less features for parsing/obj interaction than z8 machine games, but from a 16 bit machine and up (nothing today, a DOS PC would count) will run z8 games and get pretty complex text adventures. Compare Tristam Island or the first Zork I-III to Spiritwrak, where a subway it's simulated, or Anchorhead.

And you can code the games with Inform6 and Inform6lib on maybe a 286 or 386 with DOS and any text editor. Check the Inform Beginner's Guide and DM4.pdf. And not just DOS: Windows, Linux, BSD, Macs... even Android under Termux. And the games will run under Frotz for Termux, Lectrote, or Fabularium. Under iOS, too.

Nethack/Slash'EM weighs MBs and has tons of replayability. Written in C. It will even run on a 68020 System 7 based Mac... emulated under 9front with a 720 CPU as the host. It will fly from a 486 CPU and up.

Meanwhile, Cataclysm: DDA uses C++ and needs a huge chunk of RAM and a fast CPU to compile today. Some high-end Pentium 4 with 512MB of RAM will run it well enough, but you need to cross-compile it.

If I had the skills I would rewrite (no AI/LLMs please) CDDA:BN in Go. Compile times would plummet and CPU usage would be nearly the same. OFC the GC would shine here, pruning tons of unused code and data from generated worlds.

axegon_ 9 hours ago

Despite being a mid-late millennial, I can see how this played out. Even compared to the second family computer my parents got in the late 90s, which was an absolute monster at the time, I do realize how many corners developers had to cut and shortcuts they had to take to get a game going in a few hundred megabytes, seeing mobile games today easily exceed 10 times that - and not just now, but even 10 years ago when I was working at a company that made mobile games. These days, developers automatically assume everyone has what are effectively unlimited resources by 90s standards (granted they haven't transitioned to slop-coding, which makes it substantially worse).

Personally, I have a very strange but useful habit: when I find myself with some spare time at work, I spin up a very under-powered VM and start running what is in production, trying to find optimizations. One of my data pipelines is pretty much insanity in terms of scale, and running it took over 48 hours. Last time (a few weeks ago actually), I did the VM thing and started looking for optimizations, and I found a few that were completely counter-intuitive at first - everyone was like "nah, that makes no sense". But now the pipeline runs in just over 10 hours. It's insane how many shortcuts you force yourself to find when you put a tight fence around yourself.

keepamovin 7 hours ago

Yes this is a great methodology. I found developing BrowserBox (which is real time interactive streaming for remote browsers), using slow links, and a variety of different OS, really stresses parts of the system and causes improvements to be necessary that strengthen the whole.

christkv 12 hours ago

Oh man the tape loading time. I dreamed about being able to afford a disk drive.

boptom 11 hours ago

The loading music is exceptional and I enjoyed listening to it while waiting.

I still occasionally listen to it.

the_af 7 hours ago

Last Ninja has my favorite music from the C64 era.

Have you listened to the live versions by the Fastloaders? They had Ben Daglish before he passed away.

christkv 8 hours ago

Well I got to listen to it a lot lol.

Since you enjoy SID music, check out this crazy hack someone did with 8 SID chips.

https://www.youtube.com/watch?v=nhz3vHYX0E0

cineticdaffodil 7 hours ago

I wonder, could you make a game that small by using SVGs?

jml7c5 11 hours ago

Is this even correct? It was a two-sided disk, and each side was 174 KB.

reedycat 16 hours ago

Masterpieces like these are a perfect demonstration that performance relies not only on fast processors, but on understanding how your data and code compete for resources. Truly admirable. Thanks for the trip down memory lane.

lutusp 4 hours ago

> ... 40 kilobytes.

How times have changed. My best-selling program "Apple Writer", for the Apple II, ran in eight kilobytes. It was written entirely in 6502 assembly language.

mock-possum 14 hours ago

Wow, that search/interact mechanic is obnoxious. You can see the player fumbling it every time, despite knowing exactly where the item they're trying to collect is.

beautron 14 hours ago

This is sort of the defining mechanic of these games in my memory. The first thing that pops into my head when I think of Last Ninja is aligning and realigning myself, and squatting, awkwardly and repeatedly (just like a real ninja, lol), until that satisfying new item icon appears. Perhaps surprisingly, these are very fond memories.

This mechanic is augmented by not even always knowing which graphics in the environment can be picked up, or by invisible items that are inside boxes or otherwise out of sight (I think LN2 had something in a bathroom? You have to position yourself in the doorway and do a squat of faith).

The other core memory is the spots that require a similarly awkward precision while jumping. These are worse, because each failure loses you one of your limited lives. The combat is also finicky. I remember if you come into a fight misaligned, your opponent might quickly drain your energy while you fail to get a hit in.

At the time, it seemed appropriate to me that it required such a difficult precision to be a ninja. I was also a kid, who approached every game non-critically, assuming each game was exactly as it was meant to be. Thus I absolutely loved it, lol.

bni 8 hours ago

> LN2 had something in a bathroom?

Toilet flush chains. You entered two different park restrooms (both marked F) and combined them into nunchucks.

medwards666 12 hours ago

> LN2 had something in a bathroom? You have to position yourself in the doorway and do a squat of faith)

Sounds like every time I go to the bathroom ... :D

nekooooo 2 hours ago

Joysticks only had one fire button.

emsign 8 hours ago

And it was one of the best games ever made. Back in the day equivalent to a AAA tier game of today.

khaledh 8 hours ago

Constraints breed creativity.

userbinator 15 hours ago

The same size as Super Mario Bros. (NES, 1985)

cubefox 13 hours ago

A game which was actually 40 kilobytes: Super Mario Bros. It had 32 side-scrolling levels.

benchloftbrunch 7 hours ago

27 unique levels. 40KB minus a handful of spare bytes and some unused code. The max the NES can support without mappers. Modern NES homebrew and demoscene can do fancier stuff with this budget given the extra decades of learned tricks, but for the state of console gaming in 1985, SMB1 is damn impressive.

Also remember all of that was ROM, the NES had a mere 2 kilobytes of RAM for all your variables and buffers.

derefr 2 hours ago

> I still struggle to comprehend, even in the slightest, how programmers back then did what they did - and the worlds they created with the limitations they had to work with.

Highly related: two videos covering exactly how they fit...

- Super Mario Bros 1 into 40KiB (https://www.youtube.com/watch?v=1ysdUajrhL8)

- and Super Mario Bros 2 into 256KiB (https://www.youtube.com/watch?v=UdD26eFVzHQ)

I highly advise watching the actual videos to best understand, since all the techniques used were very likely devised from a game-dev perspective, rather than by invoking any abstract CS textbook learning.

But if I did want to summarize the main "tricks" used, in terms of such abstract CS concepts:

1. These old games can be understood as essentially having much of their data (level data, music data, etc) "compressed" using various highly-domain-specific streaming compressors. (I say "understood as" because, while the decompression logic literally exists in the game, there was likely no separate "compression" logic; rather, the data "file formats" were likely just designed to represent everything in this highly-space-efficient encoding. There were no "source files" using a more raw representation; both tooling and hand-edits were likely operating directly against data stored in this encoding.)

2. These streaming compressors act similar to modern multimedia codecs, in the sense that they don't compress sequences-of-structures (which would give low sequence correlation), but instead first decompose the data into distinct, de-correlated sub-streams / channels / planes (i.e. structures-of-sequences), which then "compress" much better.

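The gain from de-correlated sub-streams can be sketched in a few lines of Python. This is purely illustrative: the record layout and the run-length coder below are invented for the example, not taken from any actual game.

```python
def rle(seq):
    """Run-length encode a sequence into (value, count) pairs."""
    out = []
    for v in seq:
        if out and out[-1][0] == v:
            out[-1] = (v, out[-1][1] + 1)
        else:
            out.append((v, 1))
    return out

# Interleaved records: (tile_id, palette). Each field varies slowly on
# its own, but interleaving them alternates values and defeats RLE.
records = [(1, 0), (1, 0), (1, 1), (2, 1), (2, 1), (2, 1)]

interleaved = [x for rec in records for x in rec]   # sequence-of-structures
tiles    = [t for t, _ in records]                  # plane 1
palettes = [p for _, p in records]                  # plane 2

# The de-correlated planes compress far better than the mixed stream.
mixed_runs = len(rle(interleaved))
plane_runs = len(rle(tiles)) + len(rle(palettes))
print(mixed_runs, plane_runs)
```

With this toy data the interleaved stream needs 11 runs while the two planes together need only 4 - the same reason multimedia codecs split channels before coding.
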
3. Rather than attempting to decompose a single lossless description of the data into several sub-streams that are themselves lossless descriptions of some hyperplane through the data, a different approach is used: that of each sub-channel storing an imperative "painting" logic against a conceptual mutable canvas or buffer shared with other sub-channels. The data stream for any given sub-channel may actually be lossy (i.e. might "paint" something into the buffer that shouldn't appear in the final output), but such "slop"/"bleed" gets overwritten — either by another sub-channel's output, or by something the same sub-channel emits later on in the same "pass". The decompressor essentially "paints over" any mistakes it makes, to arrive at a final flattened canvas state that is a lossless reproduction of the intended state.

4. Decompression isn't something done in its entirety into a big in-memory buffer on asset load. (There isn't the RAM to do that!) But nor is decompression a pure streaming operation, cleanly producing sequential outputs. Instead, decompression is incremental: it operates on / writes to one narrow + moving slice of an in-memory data "window buffer" at a time. Which can somewhat be thought of as a ring buffer, because the decompressor coroutine owns whichever slice it's writing to, which is expected to not be read from while it owns it, so it can freely give that slice to its sub-channel "painters" to fill up. (Note that this is a distinct concept from how any long, larger-than-memory data [tilemaps, music] will get spooled out into VRAM/ARAM as it's being scrolled/played. That process is actually done just using boring old blits; but it consumes the same ring-buffer slices the decompressor is producing.)

5. Different sub-channels may be driven at different granularities and feed into more or fewer windowing/buffering pipeline stages before landing as active state. For example, tilemap data is decompressed into its "window buffer" one page at a time, each time the scroll position crosses a page boundary; but object data is decompressed / "scheduled" into Object Attribute Memory one column at a time (or row at a time, in SMB2, sometimes) every time the scroll position advances by a (meta)tile width.

6. Speaking of metatiles — sub-channels, rather than allowing full flexibility of "write primitive T to offset Y in the buffer", may instead only permit encodings of references to static data tables of design-time pre-composed patterns of primitives. For tilemaps, these patterns are often called "meta-tiles" or "macro-blocks". (This is one reason sub-channels are "lossy" reconstructors: if you can only encode macro-blocks, then you'll often find yourself wanting only some part of a macro-block — which means drawing it and then overdrawing the non-desired parts of it.)

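A minimal sketch of the metatile idea: one byte in the level stream references a design-time pre-composed 2x2 pattern of hardware tiles, which the decoder stamps into the tilemap buffer. All table contents and tile IDs below are invented for illustration.

```python
# Hypothetical metatile table: metatile ID -> 2x2 block of hardware tiles.
METATILES = {
    0x01: [[0x30, 0x31],   # e.g. a "ground block"
           [0x32, 0x33]],
    0x02: [[0x44, 0x45],   # e.g. a "brick" pattern
           [0x46, 0x47]],
}

def stamp_metatile(tilemap, mt_id, col, row):
    """Overwrite a 2x2 region of the tilemap with the metatile's tiles."""
    pattern = METATILES[mt_id]
    for dy in range(2):
        for dx in range(2):
            tilemap[row + dy][col + dx] = pattern[dy][dx]

# A tiny 4x4 tilemap buffer, initially blank (tile 0).
tilemap = [[0] * 4 for _ in range(4)]
stamp_metatile(tilemap, 0x01, 0, 2)  # ground at bottom-left
stamp_metatile(tilemap, 0x02, 2, 0)  # bricks at top-right
```

One byte per stamp instead of four is exactly the "lossy" trade-off mentioned above: if you only want part of a block, you stamp the whole thing and overdraw the rest.
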
7. Sub-channels may also operate as fixed-function retained-mode procedural synthesis engines, where rather than specifying specific data to write, you only specify for each timestep how the synthesis parameters should change. This is essentially how modular audio synthesis encoding works; but more interestingly, it's also true of the level data "base terrain" sub-channel, which essentially takes "ceiling" and "ground" brush parameters, and paints these in per column according to some pattern-ID parameter referencing a table of [ceiling width][floor height] combinations. (And the retained-mode part means that for as long as everything stays the same, this sub-channel compresses to nothing!)

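The "base terrain" synthesis idea reduces to something like the following sketch: per column, paint a ceiling and a floor whose thicknesses come from a tiny table indexed by a single pattern ID. The pattern table, tile values, and screen height here are invented for illustration.

```python
TERRAIN_PATTERNS = {      # pattern_id -> (ceiling_rows, floor_rows)
    0: (0, 2),            # open sky, 2-row floor
    1: (1, 2),            # 1-row ceiling, 2-row floor
    2: (3, 4),            # cave-like: thick ceiling and floor
}

SCREEN_ROWS = 13
SOLID = 1

def paint_column(pattern_id):
    """Synthesize one tilemap column from just a pattern ID."""
    ceiling, floor = TERRAIN_PATTERNS[pattern_id]
    col = [0] * SCREEN_ROWS
    for y in range(ceiling):                        # paint ceiling from top
        col[y] = SOLID
    for y in range(SCREEN_ROWS - floor, SCREEN_ROWS):  # paint floor from bottom
        col[y] = SOLID
    return col

# While the pattern ID doesn't change, a whole stretch of level costs
# nothing extra to encode: the decoder just repeats paint_column.
level_slice = [paint_column(1) for _ in range(16)]
```
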
8. Sub-channels may also contain certain encoded values that branch off into their own special logic, essentially triggering the use of paint-program-like "brushes" to paint arbitrarily within the "canvas." For example, in SMB1, a "pipe tile" is really a pipe brush invocation, that paints a pipe into the window, starting from the tile's encoded position as its top-left corner, painting right two meta-tiles, and downward however-many meta-tiles are required to extend the pipe to the current "base terrain" floor height.

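A pipe brush of that kind might look like this sketch: a single encoded position expands into a pipe two tiles wide, extended downward to the current floor height. Tile IDs and the floor convention are invented for the example.

```python
ROWS, COLS = 13, 16
PIPE_TOP  = (0x60, 0x61)   # lip tiles (left, right)
PIPE_BODY = (0x62, 0x63)   # shaft tiles (left, right)

def pipe_brush(tilemap, col, top_row, floor_row):
    """Paint a pipe from (col, top_row) down to floor_row (exclusive)."""
    tilemap[top_row][col]     = PIPE_TOP[0]
    tilemap[top_row][col + 1] = PIPE_TOP[1]
    for row in range(top_row + 1, floor_row):
        tilemap[row][col]     = PIPE_BODY[0]
        tilemap[row][col + 1] = PIPE_BODY[1]

tilemap = [[0] * COLS for _ in range(ROWS)]
# One "opcode" worth of data (essentially just a position) expands into
# however many tiles are needed to reach the floor:
pipe_brush(tilemap, col=4, top_row=8, floor_row=11)
```
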
9. Sub-channels may encode values ("event objects") that do not decode to any drawing operation on the target slice buffer, but which instead either immediately upon being encountered ("decompression-time event objects") or when they would be "placed" or "scheduled" if they were regular objects ("placement-time event objects"), just execute some code, usually updating some variable being used during the decompression process or at game runtime. (The thing that prevents you from scrolling the screen past the end of map data is a screen-scroll-lock event object dropped at just the right position, so that it comes into effect right before the map would run out of tiles to draw. The thing that determines where a "warp-enabled pipe" will take you is a warp-pipe-targeting event object that applies to all warp-enabled pipes encountered after it runs, until the next warp-pipe-targeting event object is encountered.)

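The event-object dispatch can be sketched as a decode loop in which certain sentinel values never draw anything and instead mutate decoder/game state. The sentinel values, state fields, and the "one immediate" encoding below are all invented for illustration.

```python
SCROLL_LOCK = 0xFD   # hypothetical "lock the screen scroll here" event
WARP_TARGET = 0xFE   # hypothetical "set warp destination" event

def decode_objects(stream):
    state = {"scroll_locked": False, "warp_target": None, "placed": []}
    it = iter(stream)
    for value in it:
        if value == SCROLL_LOCK:
            state["scroll_locked"] = True       # takes effect immediately
        elif value == WARP_TARGET:
            state["warp_target"] = next(it)     # one immediate: a world ID
        else:
            state["placed"].append(value)       # a normal drawable object
    return state

state = decode_objects([0x10, WARP_TARGET, 4, 0x22, SCROLL_LOCK])
```

Every warp-enabled pipe placed after the `WARP_TARGET` event would consult `state["warp_target"]`, until a later event overwrites it.
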
If at least some of these sub-channels are starting to sound like essentially a bytecode ISA for some kind of abstract machine — yes, exactly. Things like "event objects" and "brush invocations" can be more easily understood as opcodes (sometimes with immediates!); and the "modal variables" as the registers of these instruction streams' abstract machines.

[continued...]

derefr 2 hours ago

10. The interesting thing about these instruction streams, though, is that they're all being driven in lockstep externally by the decompressor. None of the level-data ISAs contain anything like a backward JMP-like opcode, because each level-data sub-channel's bytecode interpreter has a finite timeslice to execute per decompression timestep, so allowing back-edges [and so loops] would make the level designers into the engine developers' worst enemy. But most of the ISAs do contain forward JMPs, to essentially encode things like "no objects until [N] [columns/pages] from now." (And a backward JMP instruction does exist in the music-data parameterized-synthesis sub-channel ISA [which as it happens isn't interpreted by the CPU, but is rather the native ISA of the NES's Audio Processing Unit.] If you ever wondered how music keeps not only playing but looping even if the game crashes, it's because the music program is loaded and running on the APU and just happily executing its own loop instructions forever, waiting for the CPU to come interrupt it!)

11. These sub-channel ISAs are themselves designed to be as space-efficient as possible while still being able to be directly executed without any kind of pre-transformation. They're often variable-length, with most instructions being single-byte. Opcodes are hand-placed into the same kind of bit-level Huffman trie you'd expect a DEFLATE-like algorithm to design if it were tasked with compressing a large corpus of fixed-length bytecode. Very common instructions (e.g. a brush to draw a horizontal line of a particular metatile across a page up to the page boundary) might be assigned a very short prefix code (e.g. `11`), allowing the other six bits in that instruction byte to select a metatile to paint with from a per-tilemap metatile palette table. Rarer instructions, meanwhile, might take 2 bytes to express, because they need to "get out of the way of" all the common prefixes. (You could think of these opcodes as being filed under a chain of "Misc -> Misc -> Etc -> ..." prefixes.)

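The short-prefix-code scheme is easy to demonstrate. In this sketch, if the top two bits of an instruction byte are `11`, the remaining six bits select an entry from a per-tilemap palette; anything else falls through to longer, rarer encodings. The exact bit layout and mnemonics are invented for illustration.

```python
def decode_op(byte):
    """Decode one hypothetical instruction byte from a level-data stream."""
    if byte >> 6 == 0b11:
        # Common single-byte instruction: draw a horizontal line of the
        # palette entry given by the low six bits.
        return ("hline", byte & 0x3F)
    # Rarer opcodes would consume further bytes here
    # ("Misc -> Misc -> Etc -> ..." territory).
    return ("other", byte)

# One byte encodes both the operation and its operand:
op, palette_index = decode_op(0b11000101)
```
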
IMHO, these are all (so far) things that could be studied as generalizable data-compression techniques.

But here are two more techniques that are much more specific to game-dev, where you can change and constrain the data (i.e. redesign the level!) to fit the compressor:

12. Certain ISAs have opcodes that decode to entirely-distinct instructions, depending on the current states of some modal variables! (My guess is that this came about either due to more level features being added late in development after the ISAs had mostly been finalized; or due to wanting to further optimize data size and so seeing an opportunity to "collapse" certain instructions together.) This mostly applies to "brush" opcodes. The actual brush logic they invoke can depend on what the decoder currently sees as the value of the "level type" variable. In one level type, opcode X is an Nx[floor distance] hill; while in another level type, opcode X is a whale, complete with water spout! (In theory, they could have had an opcode to switch level type mid-level. Nothing in this part of the design would have prevented that; it is instead only impractical for other reasons that are out-of-scope here, to do with graphics memory / tileset loading.)

13. And, even weirder: certain opcodes decode to entirely-distinct instructions depending on the current value of the 'page' or 'column' register, or even the precise "instruction pointer" register (i.e. the current 'row' within the 'column'). In other words, if you picture yourself using a level editor tool, and dragging some particular object/brush type across the screen, then it might either "snap" to / only allow placement upon metatiles where the top-left metatile of the object lands on a metatile at a position that is e.g. X%4==1 within its page; or it might "rotate" the thing being dragged between being one of four different objects as you slide it across the different X positions of the metatile grid. (This one's my favorite, because you can see the fingerprint of it in much of the level design of the game. For example: the end of every stage returns the floor height to 2, so that "ground level" is at Y=13. Why? Because flagpole and castle objects are only flagpole and castle objects when placed at Y=13!)

krisgenre 10 hours ago

.. and Claude Code for Linux, the CLI binary is 200+ MB :(

1970-01-01 7 hours ago

"I still struggle to comprehend, even in the slightest, how programmers back then did what they did - and "

First of all, if you're going to LLMize your tweets, do it correctly and run a 2nd pass after you're done editing. Second, read a book. That's how we learned things in 1987.

https://archive.org/details/commodore-64-programmers-referen...

y-curious 7 hours ago

The LLM-use witch hunt accusations are rampant on every single article. That snippet doesn’t sound like an LLM to me.

Have your coffee and consider that this person was just complimenting coders from back in the day.

1970-01-01 6 hours ago

Doesn't matter what you feel it sounds like - the data is held within the syntax, and this is the biggest tell.

https://tropes.fyi/tropes-md#:~:text=The%20single%20most%20c...