The Forever File? What IPFS Promises (and What It Doesn’t) for Self-Hosting Archives

There’s a certain comfort in putting a file online and thinking, this will always be here. Maybe it’s your old blog, a dead forum you loved, or a hand-rolled zine from the 2000s you want to preserve forever.

But we’ve all seen what happens. Domains expire. Hosting plans lapse. Platforms shut down. And when that happens, your files don’t just go offline; they vanish from the story entirely.

That’s why IPFS sounds like magic. A decentralized, peer-to-peer file system where content isn’t hosted in one place, but everywhere it’s wanted. No server to pay. No company to trust. Just content, addressable by its own hash, floating in a web of shared machines.

But does it really work that way? Let’s dig in.

What Is IPFS, Really?

IPFS stands for InterPlanetary File System - a protocol designed to make the web more distributed and resilient. Instead of finding content by where it lives (a URL on a specific server), IPFS finds it by what it is - a content identifier (CID) derived from a cryptographic hash of the file’s data.

If two people upload the same file, they get the same hash. If the file changes even slightly, it gets a new hash. It’s a system that emphasizes integrity over location.
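
To make that concrete, here’s a minimal sketch in Python, assuming a local IPFS node (such as Kubo) with its RPC API on the default port 5001. It hashes the same bytes twice and checks that the identifiers match; the “only-hash” option of the add endpoint just computes the identifier without storing anything.

    # A minimal sketch: same content -> same identifier.
    # Assumes a local IPFS node (e.g. Kubo) exposing its RPC API at 127.0.0.1:5001.
    import requests

    API = "http://127.0.0.1:5001/api/v0/add"
    data = b"a hand-rolled zine from the 2000s"

    def content_id(payload: bytes) -> str:
        # "only-hash" asks the node to compute the identifier without storing the bytes
        resp = requests.post(API, params={"only-hash": "true"}, files={"file": payload})
        resp.raise_for_status()
        return resp.json()["Hash"]

    first = content_id(data)
    second = content_id(data)
    print(first)             # e.g. Qm... - identical both times
    assert first == second   # same bytes, same identifier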

Once added to IPFS, a file can be fetched by anyone running a node. If enough nodes “pin” the file (i.e. choose to keep a copy that won’t be garbage-collected), the content stays available even if the original uploader disappears. At least, in theory.
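
On your own node, pinning is a single call. A rough sketch against that same local RPC API - the CID here is a placeholder, swap in one of your own:

    # Pin a CID on your own node so it survives garbage collection.
    # Assumes a local Kubo node at 127.0.0.1:5001.
    import requests

    API = "http://127.0.0.1:5001/api/v0"
    cid = "QmYourContentIdentifierHere"   # replace with a real CID from an earlier add

    requests.post(f"{API}/pin/add", params={"arg": cid}).raise_for_status()

    # List pins for that CID to confirm it stuck.
    pins = requests.post(f"{API}/pin/ls", params={"arg": cid}).json()
    print(pins)   # e.g. {'Keys': {'Qm...': {'Type': 'recursive'}}}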

Hosting Without a Host (But Not Without Help)

The big promise of IPFS is that no central server is required. You don’t need AWS or GitHub Pages or even a domain name. You just publish the content, give out the hash, and anyone with an IPFS gateway or node can retrieve it.

But here’s the thing: availability depends on demand and persistence. If you add a file but don’t keep a node online that pins it, or convince others to pin it, it will eventually disappear from the network. IPFS doesn’t magically host your data. It distributes it. That’s not the same as guaranteeing it stays online.
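
One way to see this for yourself is to probe a couple of public gateways and check whether a CID is actually retrievable right now. A small sketch, with a placeholder CID and two well-known public gateways:

    # Probe public gateways to see whether a CID is reachable at the moment.
    import requests

    cid = "QmYourContentIdentifierHere"   # replace with a real CID
    gateways = ["https://ipfs.io", "https://cloudflare-ipfs.com"]

    for gw in gateways:
        url = f"{gw}/ipfs/{cid}"
        try:
            # stream=True: we only care whether it resolves, not about downloading it all
            resp = requests.get(url, timeout=15, stream=True)
            print(url, "->", resp.status_code)
        except requests.RequestException as err:
            print(url, "-> unreachable:", err)

A 200 means somebody, somewhere, is still providing the bytes; a timeout usually means nobody is.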

To make something truly “forever,” you either need to run your own persistent node or use a pinning service. Tools like Web3.storage, Pinata, or Filecoin-backed solutions help fill that gap. They act a bit like traditional hosting, but with the IPFS protocol underneath.
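
Many of these services also implement the shared IPFS Pinning Service API, so a remote pin request tends to look roughly like the sketch below. The base URL, token, and CID are placeholders - check your provider’s documentation for the real endpoint and authentication details.

    # A rough sketch of a remote pin request via the shared IPFS Pinning Service API.
    # Base URL, token, and CID are placeholders, not a real provider's values.
    import requests

    PINNING_API = "https://pinning.example.com"   # placeholder - your provider's endpoint
    TOKEN = "YOUR_ACCESS_TOKEN"                   # placeholder - issued by the service
    cid = "QmYourContentIdentifierHere"           # placeholder

    resp = requests.post(
        f"{PINNING_API}/pins",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"cid": cid, "name": "my-zine-archive"},
    )
    resp.raise_for_status()
    print(resp.json().get("status"))   # e.g. "queued", later "pinned"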

The idea isn’t to replace all infrastructure. It’s to make that infrastructure portable and permissionless.

The Experience in Practice

Here’s what using IPFS actually looks like:

  1. You install a desktop client or use a web interface to add your files.

  2. IPFS gives you a content identifier, or CID (something like QmXYZ...), that acts as a permanent identifier for that exact version of your content.

  3. You can access it locally or through a public gateway like ipfs.io or cloudflare-ipfs.com.

  4. If you want others to always find it, even when your computer is off, you pin it with a service.

The real magic is that the content doesn’t need a URL anymore. It doesn’t care where it lives, just that the hash matches.

Want to mirror a lost website? IPFS can do it. Want to share a file that can’t be censored easily? IPFS works there too. Want to back up a personal digital collection without relying on a tech giant? You’re in the right place.
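
For the site-mirroring case, a scraped copy is just a folder of static files, and IPFS will happily treat the whole folder as one item with a single root CID. A sketch, assuming the ipfs command-line tool (Kubo) is installed and the mirror sits in a local ./site-mirror folder:

    # Add a whole directory (e.g. a scraped static site) in one go.
    # Assumes the `ipfs` CLI (Kubo) is installed; ./site-mirror is a placeholder path.
    import subprocess

    # --recursive adds the whole folder; --quieter prints only the final root CID
    result = subprocess.run(
        ["ipfs", "add", "--recursive", "--quieter", "./site-mirror"],
        capture_output=True, text=True, check=True,
    )
    root_cid = result.stdout.strip()

    print(f"Browse it at: https://ipfs.io/ipfs/{root_cid}/")
    print(f"Ask others to pin this CID: {root_cid}")

Hand that one root CID to anyone willing to pin it, and the whole site travels with it.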

But be warned: if nobody pins it, nobody gets it. Archiving takes commitment.

What IPFS Is Good At

  • Integrity and versioning: You can be sure a file hasn’t been tampered with, and updates generate new hashes by design.

  • Distributed hosting: Anyone running a node can help host your content - even temporarily, just by fetching and caching it.

  • No domain required: You don’t need to maintain a .com or a server to share something.

  • Great for small static sites, PDFs, and digital assets: Especially when paired with local backups or public pins.

It’s like Archive.org, BitTorrent, and Git had a quiet meeting and decided to build a smarter hard drive.

Where It Still Breaks Down

  • Discoverability is weak: There’s no built-in search engine. If you lose the hash, you lose the file.

  • Pinning still relies on services or your own hardware: It’s not magically self-hosting forever.

  • Gateways go down or throttle requests: If you rely on public access points, they can slow or fail.

  • Dynamic content, databases, and logins? Forget it: IPFS isn’t made for live, interactive apps.

So while the dream of a forever web sounds great, it still needs caretakers. Just like a well-kept library or a seed archive, somebody has to care enough to host the file and pass it on.

Should You Use IPFS for Your Archives?

If you’re trying to preserve old sites, zines, manuals, or rare web ephemera, the answer is probably yes. At least as one layer of your strategy.

We always recommend redundancy - host it on traditional servers, mirror it in IPFS, and keep local backups. For extra protection, use archive.org’s Wayback Machine and our own Smartial extractor to grab and store page content while it’s still live.
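
For the Wayback Machine half of that advice, a capture can be triggered through its public “Save Page Now” endpoint; a tiny sketch, with a placeholder target URL:

    # Ask the Wayback Machine to capture a page while it's still live.
    import requests

    target = "https://example.com/old-zine/index.html"   # placeholder URL
    resp = requests.get(f"https://web.archive.org/save/{target}", timeout=60)
    print(resp.status_code)   # 200 usually means the capture was made or queued
    print(resp.url)           # often ends up at the archived snapshot's address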

IPFS won’t save the web on its own. But it adds a layer of resilience for the people who are trying.

And that’s what preservation has always been about - not perfection, but care in the face of decay.

Want forever? Pin it, share it, and check on it once in a while. Nothing lasts unless someone makes sure it does.