Project Nomad – Knowledge That Never Goes Offline (projectnomad.us)

257 points by jensgk 7 hours ago

Animats 17 minutes ago

There's a company which sells something like this, as "Prepper Disk".[1]

In the 1950s, US Civil Defense had a set of microfilms on how to rebuild society. These were packaged with a sunlight reader and stored in larger fallout shelters. Someone should find one of those.

[1] https://www.prepperdisk.com/

adsharma 3 hours ago

So this thing is based on Kiwix, which is based on the ZIM file format.

Meanwhile, Wikipedia ships Wikidata as RDF dumps (probably 8x less compressed than they could be).

https://www.wikidata.org/wiki/Wikidata:Database_download

There is room for a third option leveraging commercial columnar database research.

https://adsharma.github.io/duckdb-wikidata-compression/
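The columnar intuition can be sketched with stdlib-only Python. This is a toy illustration, not the linked DuckDB work: the triple data is synthetic and all identifiers are made up, but it shows why storing RDF-style triples column-by-column helps a general-purpose compressor.

```python
import json
import zlib

# Synthetic stand-in for a slice of a Wikidata-style triple dump:
# (subject, predicate, object) rows where predicates and many objects
# repeat heavily. All IDs and values here are illustrative.
rows = [(f"Q{i}", "P31", "Q5") for i in range(1000)] + \
       [(f"Q{i}", "P569", f"19{i % 100:02d}-01-01") for i in range(1000)]

# Row-oriented layout: records interleaved, so repetition is broken up.
row_blob = json.dumps(rows).encode()

# Column-oriented layout: each column stored contiguously, so long runs
# of identical predicates/objects sit next to each other.
col_blob = json.dumps(list(zip(*rows))).encode()

row_size = len(zlib.compress(row_blob, 9))
col_size = len(zlib.compress(col_blob, 9))
print(row_size, col_size)  # the columnar layout should compress smaller
```

Real columnar engines go further (dictionary and run-length encoding before general compression), but the layout effect alone is already visible here.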

jrm4 29 minutes ago

And for those who are only vaguely familiar, this ZIM file format is not the same as the https://zim-wiki.org one.

hofrogs 20 minutes ago

I am actually only vaguely familiar, and I wondered about that every time I saw the format referenced but never bothered to check; your comment is informative!

hamstergene 12 minutes ago

Normally I cringe at doomsday preppers, but given how many dictators out there love to cut their country off from the Internet whenever things start going against them, I imagine a lot of people may find this useful.

I wouldn’t want to lose access to knowledge of how to fix a sink or which medication is better, just because the local kingface currently feels that the free exchange of opinions about him threatens his kingship.

Aargau 25 minutes ago

Closing on 40 acres in Panama for an eco-resort.

I was planning to build my own offline repository, but will check out this repo.

Yokohiii 4 hours ago

I like the idea of an LLM that acts as a public knowledge base. But that doomsday framing on the site is pretty annoying.

waynerisner 3 hours ago

I think there’s a difference between doomsday framing and preparedness.

Offline access and local models aren’t about assuming collapse—they’re about treating knowledge as infrastructure instead of something implicitly guaranteed.

That feels more like resilience than pessimism.

dogma1138 43 minutes ago

If current frontier online LLMs are made inaccessible by a local or global cataclysmic event, running models locally will be the least of your concerns.

This isn’t prepping for anything; it’s cosplaying as a vault dweller.

P.S. Having TED talks as part of the “educational” curriculum of this project is probably the biggest circle jerk imaginable.

adsharma 3 hours ago

This is not just a random idea.

AlexNet -> Transformers -> ChatGPT -> Claude Code -> Small LMs serving KBs

Large LLMs could have a role in efficiently producing such KBs.

russellbeattie 2 hours ago

Doomsday may not be the end of the world; it may simply be living in a country where you're being unjustifiably bombed by a foreign government led by a delusional sociopath, so that access to information sources becomes limited.

dogma1138 38 minutes ago

You’ll be hanged from a construction crane if they catch you with this project in Iran… :)

DoctorOetker an hour ago

What Gulf state do you live in? UAE?

nelsonic an hour ago

For anyone wanting the video explanation from the creator, watch: https://youtu.be/P_wt-2P-WBk

cstaszak 2 hours ago

I'm a fan of "civilization in a box" kinds of projects. However, the ZIM file format leaves a lot to be desired in 2026. I've been exploring a refreshed, alternative approach: https://github.com/stazelabs/oza

I do think having an LLM as an optional "sidecar" is a useful approach. If you can run a meaningful Ollama instance alongside your content, great!
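A minimal sketch of that sidecar pattern, assuming a stock Ollama daemon on its default port (the model name is a placeholder): the knowledge base stays authoritative, and the model is only asked to answer from retrieved text.

```python
import json
import urllib.request

# Assumed setup: a local Ollama instance at its default endpoint with some
# small model pulled. The article text comes from the offline KB; the LLM
# is a layer on top, not the source of truth.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_prompt(article: str, question: str) -> str:
    # Constrain the model to the provided reference text.
    return (
        "Answer using only the reference text below.\n\n"
        f"Reference:\n{article}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )


def ask_sidecar(article: str, question: str, model: str = "llama3.2") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(article, question),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Offline demo of the prompt construction (no server needed for this part).
prompt = build_prompt("Boil water for at least one minute.", "How long?")
print(prompt)
```

If the daemon isn't running, the content is still fully browsable; only `ask_sidecar` stops working, which is the point of keeping the LLM optional.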

codeveil 14 minutes ago

ZIM or not, I think the “LLM as optional sidecar” part is the right idea.

The durable asset is the knowledge base itself. A local model can be useful on top, but it should stay a layer, not become the dependency.

Lapra 3 hours ago

In a world where this is useful, you aren't going to be spending your precious battery on running an LLM...

desireco42 3 hours ago

Solar cells work no matter what. I agree that less processing may be more useful, but an LLM is uniquely useful as well.

layer8 2 hours ago

No need for a battery, you just need someone to hit the pedals on that dynamo.

qingcharles 2 hours ago

This is not true for me. I would want an LLM after the apocalypse. I'd become like the Wizard of Oz, the all-knowing oracle.

amarant an hour ago

>Knowledge That Never Goes Offline

>What is Project N.O.M.A.D.? Node for Offline Media, Archives, and Data

That's the first header, and the first sentence of the first paragraph, and I'm confused.

DonaldPShimoda an hour ago

Two different uses of "offline", I think. From my own understanding:

To "go offline" means for something to become inaccessible that was once accessible "online". ("Offline" is an adverb.)

Meanwhile, an "offline" thing is one which is usable even without ever being "online". ("Offline" is an adjective.)

So it becomes:

> "Knowledge That Never [Becomes Inaccessible]"

> "Node for [Accessible-Without-Connection] Media, Archives, and Data"

But definitely confusing to put them right next to each other like that. You'd think a copyeditor would flag it or something.

collabs an hour ago

My guess is

>Knowledge That Never Goes Offline

Means

>Knowledge That Never becomes inaccessible to you

While the next offline means you can access it even if you don't have access to a wider network.

At least that's how I would read it.

JanisIO 5 hours ago

Anyone thought about using a Steam Deck with this? Or explored the concept of a "Nomad Deck"?

c0balt 5 hours ago

It might be an interesting idea given that the Steam Deck has a reasonable amount of RAM/GPU. The main issue for a knowledge base might be the lack of a physical keyboard, though.

mhitza 3 hours ago

It has built in microphones though.

wds 4 hours ago

Not sure how good of an idea a Steam Deck would be for this. If you can't access Wikipedia, I imagine a replacement for its unprotected glass screen would be harder to come by if you drop it.

JanisIO 2 hours ago

True, but I always give my devices a protective glass screen and put them in rugged armor. Broken screens have never been a problem for me.

iandanforth 3 hours ago

I like this idea! I don't need the LLM bits, and want it to run on an old Android tablet I have lying around. Can anyone recommend similar software where I can get wikipedia / street maps / useful tutorial videos nicely packaged for offline use?

entropie 2 hours ago

A friend made this years ago. I never used it but the idea is awesome.

https://github.com/ligi/SurvivalManual

WillAdams 6 hours ago

Missing a chance to note (or configure for?) installation on a Raspberry Pi --- that'd make an affordable option to leave powered down, but ready to go in an EMI-shield/Faraday Cage.

pdpi 3 hours ago

They specifically state that they’re aiming for a “fatter” model that expects higher-end hardware, and other projects like Internet in a box already target rpi-style devices.

moffers 5 hours ago

Really clever targeting of a niche. I’d be interested to hear if they find success!

kgeist 4 hours ago

Also https://kiwix.org/en/about/

I used it on a long train trip. There was no internet due to drone attacks, and with Kiwix I could browse pre-downloaded wikis.

cousinbryce 4 hours ago

I’m convinced that the multitude of off-line Internet tools is a ploy to keep any one of them from gaining traction

lucasluitjes 4 hours ago

The ones mentioned in this thread all use Kiwix for offline Wikipedia, OSM for maps, and Khan Academy for educational videos. It looks like internet-in-a-box is aimed at working well on low-powered devices, whereas nomad expects beefy hardware and includes local AI. Not sure how WROLPi differs from internet-in-a-box.

Maybe it's like linux distros: all based on the same software, but optimized for different use-cases or preferences.

leowoo91 2 hours ago

It could apply some of its own wisdom and not use Node.js..

balkanist 37 minutes ago

This is really cool. Having offline Wikipedia + local LLMs in a single bundle is a great combo for emergency preparedness. Do you have any benchmarks on how it performs on lower-end hardware? Curious about minimum specs.

ZeroCool2u 3 hours ago

See, I really want this in a simpler format. Like a single-file embedded database on my filesystem that I can point one or a few tools at for my model to use when it needs to.
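That single-file idea can be sketched with SQLite's FTS5 full-text index: one database file holds the articles, and any tool (or a local model doing retrieval) queries it. Table and column names below are illustrative, not from any existing project.

```python
import sqlite3

# One file on disk is the whole knowledge base; ":memory:" here just keeps
# the demo self-contained -- swap in a path like "kb.sqlite3" for real use.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE articles USING fts5(title, body)")
con.executemany(
    "INSERT INTO articles (title, body) VALUES (?, ?)",
    [
        ("Water purification", "Boil water for one minute to kill pathogens."),
        ("Fixing a sink", "Shut the supply valve before removing the trap."),
    ],
)

# Full-text search with BM25-style ranking; the hits could be read directly
# or handed to a local model as context.
rows = con.execute(
    "SELECT title FROM articles WHERE articles MATCH ? ORDER BY rank",
    ("boil water",),
).fetchall()
print(rows)  # → [('Water purification',)]
```

FTS5 ships with the standard Python `sqlite3` module on most builds, so this needs no server and no extra dependencies.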

itintheory 2 hours ago

Why does it have to have AI? Ugh.

layer8 2 hours ago

You can use Kiwix, OpenStreetMap and Kolibri as an AI-free equivalent. Adding AI to those is exactly the differentiator of this project.

pstuart 2 hours ago

I get the hate on AI for many reasons (hype, resource greediness, threat to civilization, etc), but having a local LLM that could help guide and reason about the data within seems like a win, especially if it's optional.

bpavuk 4 hours ago

turns out I have the same setup (sans local LLMs - they are pretty useless on 2018 cards) but in Obsidian :)

whatever I think might be useful later, I capture through the web clipper extension. [0]

[0]: https://obsidian.md/clipper

mohamedkoubaa 4 hours ago

Great premise for a science fiction story

shevy-java 5 hours ago

So how does that work?

WJW 4 hours ago

It never goes offline by already being offline.

tsss 6 hours ago

I was expecting the game from my childhood and was disappointed.

aquariusDue 5 hours ago

Yeah, that game was really ahead of its time. I still hold out hope some indie studio will attempt a spiritual successor.