We found a stable Firefox identifier linking all your private Tor identities (fingerprint.com)

753 points by danpinto 18 hours ago

lpapez 17 hours ago

Very cool research and wonderfully written.

I was expecting an ad for their product somewhere towards the end, but it wasn't there!

I do wonder though: why would this company report this vulnerability to Mozilla if their product is fingerprinting?

Isn't it better for the business (albeit unethical) to keep the vulnerability private, to differentiate from the competitors? For example, I don't see many threat actors burning their zero days through responsible disclosure!

valve1 17 hours ago

We don't use vulnerabilities in our products.

mtlynch 16 hours ago

I don't understand what you mean. What separates this from other fingerprinting techniques your company monetizes?

No software wants to be fingerprinted. If it did, it would offer an API with a stable identifier. All fingerprinting is exploiting unintended behavior of the target software or hardware.

NoahZuniga 16 hours ago

The real reason is that fingerprint.com's selling point is tracking over longer periods (months, their website claims), and this doesn't help them with that.

kqp 6 hours ago

I’m going to go out on a limb and guess that you define “vulnerability” as something like “thing that will be fixed soon”. After all, Joe Random not liking a behavior doesn’t make it a vuln, there needs to be a litmus test. Am I close?

stackghost 12 hours ago

All fingerprinting is a vulnerability, unless the client opts-in.

jachee 2 hours ago

Any method of “fingerprinting” and invading a browser’s privacy is inherently an exploit.

hrimfaxi 17 hours ago

They probably are not relying on it and disclosure means others can't either.

kippinsula 4 hours ago

the business answer is boring: you don't sit on a browser zero-day that your own product depends on. if it leaks from somewhere else, the blog post writes itself and the trust you've built with every privacy researcher and enterprise buyer evaporates. honestly the hiring page line alone, 'we found and reported X to Mozilla', is probably worth more than the fingerprinting edge they'd keep.

tcp_handshaker 2 hours ago

>> why would this company report this vulnerability to Mozilla if their product is fingerprinting?

Maybe because it's not as serious as they and their title made it out to be? Did you read it fully?

The identifier described is only stable for the lifetime of the browser process: it is not machine stable, profile stable, or installation stable. The article itself says it resets on a full browser restart...

So this is not a magic forever ID, and not some hardware-tied supercookie. Now what should we do with that title, and the authors of it?

Cider9986 8 hours ago

Being fingerprinted across Tor is different from being deanonymized: it basically just "pseudonymizes" you. You now have an identifier. It is a significant threat, but it is not hard to "pseudonymize" someone based on stylometry, and some of the people with the highest threat model (those operating an illegal site) will be pseudonymous anyway.

Don't get your opsec advice from HN. Check whonix, qubes, grapheneos, kicksecure forums/wikis. Nihilist opsec, Privacyguides.

grumbelbart2 5 hours ago

This fingerprint persists over private and non-private Firefox sessions until you restart Firefox. State actors might be able to connect your Google-login in FF window 1 with your tor session in FF private window 2.

sigmoid10 4 hours ago

Good opsec usually means you don't do this anyway. Don't use your anonymous browser for anything related to your real persona. In fact, don't re-use the OS between anonymous and public personas. Or even better: Don't re-use the hardware (also goes for networking). There will always be bugs across all levels of software and hardware that could eventually be chained to expose you. But if there is nothing there that could be exposed, you're already much better off by default. Even if that is very hard to achieve in practice.

realusername 4 hours ago

Usually you have the Tor Browser for Tor and a standard Firefox for standard browsing, so they already are two sessions.

yencabulator 14 hours ago

> the identifier can also persist [...] as long as the Firefox process remains running

Make sure to exit Tor Browser at the end of a session. Make sure not to mix two uses in one session.

SeriousM 6 hours ago

Or shut down and boot tails again. You need privacy? Take your time.

friendzis 4 hours ago

Anyone that serious about opsec should have dedicated hardware for that anyway

Phelinofist 4 hours ago

Why not tails in a VM?

negura 2 hours ago

the vulnerability was fixed upstream by mozilla anyway

yard2010 4 hours ago

Use a separate machine for this stuff; never mix your clean machines with the dirty ones. Complete separation, different networks.

bfivyvysj 8 hours ago

I learned enough about security years ago to know that there's basically zero chance you're secure and almost a 100% chance someone is watching everything you do online.

Whether they care is entirely separate.

PoignardAzur 3 hours ago

Ah, yes, the "fuck it" approach to infosec.

jimbo808 2 hours ago

Could be accurate but governments can be profoundly incompetent even with great capability at their disposal

Amekedl 7 minutes ago

"The signal is not just stable. It also has high capacity." Stopped reading right there. Also, it's nothing that anybody using Tails, for example, should have to worry about. Nothingburger.

bawolff 17 hours ago

From the sounds of it, this doesn't persist past a browser restart? I think that would significantly reduce the usefulness to attackers.

piccirello 15 hours ago

This excerpt from the article describes the risk well.

> In Firefox Private Browsing mode, the identifier can also persist after all private windows are closed, as long as the Firefox process remains running. In Tor Browser, the stable identifier persists even through the "New Identity" feature, which is designed to be a full reset that clears cookies and browser history and uses new Tor circuits.

fc417fc802 10 hours ago

I wonder why "New Identity" wasn't implemented as a fork-and-exec with a newly created profile?

warkdarrior 16 hours ago

This is where you use id bridging.

1. Website fingerprints the browser, stores a cookie with an ID and a fingerprint.

2. During the next session, it fingerprints again and compares with the cookie. If fingerprint changed, notify server about old and new fingerprint.
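In Python, the server-side half of that loop might look like the following toy sketch (all names here are hypothetical, not any real product's API):

```python
# Toy sketch of server-side "id bridging": if the returning visitor's
# fingerprint no longer matches the one stored alongside their cookie ID,
# link the old and new fingerprints to the same identity.
# Every name here is illustrative, not any real product's API.

id_by_fingerprint = {}   # fingerprint -> stable visitor ID
history = {}             # visitor ID -> list of fingerprints seen

def observe(cookie_id, fingerprint):
    """Record a visit; bridge the ID if the fingerprint drifted."""
    if cookie_id is not None:
        # Cookie survived: even if the fingerprint changed, keep the old ID
        # and remember the new fingerprint as an alias for it.
        id_by_fingerprint[fingerprint] = cookie_id
        history.setdefault(cookie_id, []).append(fingerprint)
        return cookie_id
    # No cookie (cleared?): fall back to fingerprint lookup.
    if fingerprint in id_by_fingerprint:
        return id_by_fingerprint[fingerprint]
    new_id = f"v{len(history) + 1}"
    id_by_fingerprint[fingerprint] = new_id
    history[new_id] = [fingerprint]
    return new_id

# First visit: the cookie gets set to the returned ID.
vid = observe(None, "fp-aaa")
# Fingerprint changes but the cookie is still there: the IDs get bridged.
observe(vid, "fp-bbb")
# Later the cookie is cleared, but the *new* fingerprint still maps back.
assert observe(None, "fp-bbb") == vid
```

The point of the sketch is that either signal alone surviving (cookie or fingerprint) is enough to re-link the visitor, which is why clearing only one of them doesn't help.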

mmooss 16 hours ago

Many users leave their browsers open for months.

allthetime 14 hours ago

Privacy and security conscious Tor users don’t.

autoexec 8 hours ago

Open enough tabs and you'd be lucky to keep firefox running for more than a couple weeks.

shevy-java 17 hours ago

Would it though? I guess state agencies already know all nodes, or may know all nodes. When you have a ton of cross-linked meta-information, they can probably identify people quite accurately; they may not even need 100% accuracy at all times and could do with less. I was thinking about that when they used information from any surrounding area, or even sniffing through walls (I don't quite recall the article, but wasn't there one like that in the last 3-5 years?). The idea is to amass as much information as possible, even if it doesn't primarily concern the target user alone; e.g. I would call it "identifying via proxy information".

Barbing 15 hours ago

> I guess state agencies already know all nodes or may know all nodes.

Assume the same.

>The idea is to amass as much information as possible

Reminded, from 2012: https://www.wired.com/2012/03/ff-nsadatacenter/

akimbostrawman 5 hours ago

All Tor nodes are publicly known. Just knowing them doesn't help tracking at all because of onion routing; they would need access to all nodes.

https://metrics.torproject.org/rs.html

firefax 15 hours ago

The OP's link is timing out over Tor for me, but the Wayback[1] version loaded without issue.

Also, does anyone know of any researchers in the academic world focusing on this issue? We are aware that EFF has a project on this subject that used to be named after a pedophile, but we are looking more for professors at universities or pure research labs a la MSR or PARC than activists working for NGOs, however pure their praxis :-)

As privacy geeks, we have become fascinated with the topic -- it seems that while we can achieve security through extensions like noscript or ublock origin or firefox containers (our personal "holy trinity"), anonymity slips through our fingers due to fingerprinting issues. (Especially if we lump stylometry in the big bucket of "fingerprinting".)

[1] https://web.archive.org/web/20260422190706/https://fingerpri...

spelledwrong 9 hours ago

>We are aware that EFF has a project that used to be named after a pedophile on this subject

You bring this up like it's a well known incident, but my googling can find no evidence of it. The only reason not to say the name of the project would be if it's common knowledge, but it's not?

ChatGPT research reckons you're making it up, and I'd be curious if you have evidence to the contrary?

firefax an hour ago

It used to be called Panopticlick, a reference to Foucault's theory of the panopticon. Foucault's extracurriculars are well documented, and not everything is an "incident" -- it's a thread on fingerprinting. People who study that are aware of what is now called "Cover Your Tracks", and people who do postgrads tend to be well rounded enough to have read a bit of philosophy, or at least, they did in my day.

So what happened here is basically... AI told you that something that made you suspicious because you have zero subject matter expertise is suspect?

I'm not really sure how to react to someone who has a robot affirm their anxieties other than to stand by my previous statements and give a polite pointer at some terms to look up on Wikipedia rather than feed into a clanker.

tomrittervg 9 hours ago

Mozilla is working on it. (I know you said 'Academic', but we publish papers sometimes too.)

firefax an hour ago

I'd lump Mozilla into the bucket since it's a nonprofit and open source. It's hard to come up with an objective list of what makes an org "good", so sometimes it's been useful to fall back on the fact that, at least in the States, academics are bound by the IRB.

Cynddl 13 hours ago

Yes, there's an active area of research on web fingerprinting, both attacks and defences. Look at conferences like PETS, for instance.

firefax an hour ago

pets is a good conference.

i also like anonbib as a central repo for interesting work.

https://www.freehaven.net/anonbib/topic.html

dirasieb 10 hours ago

what are you referring to with that EFF app part?

SirMaster 16 hours ago

I question why websites can even access all this info without asking or notifying the user.

Why don't browsers make it like phones where the server (app) has to be granted permission to access stuff?

michaelt 13 hours ago

Browser fingerprinting is an unintended side-effect of things it's sorta-kinda reasonable for browsers to provide.

A user agent that says the browser's version? Reasonable enough.

Being able to ask for fonts, if the system has them? Difficult to have font support without that.

Getting the user's timezone, language and keyboard layout? Reasonable.

The size of the screen, and the size of the browser window? Difficult to lay things out without that.

Of course a video or audio player needs to know which video formats your browser supports - how else to provide the right video?

Obviously javascript can get the time, and it's trivial to figure out the system's clock error by comparing that to the time on a server.

Before you know it, almost every browser is uniquely identifiable.
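To put rough numbers on that: if each leak is independent, its surprisal in bits simply adds up. The frequencies below are invented for illustration, not measured values:

```python
import math

# Hypothetical fraction of users sharing each attribute value; every entry
# contributes -log2(p) bits of identifying information. These numbers are
# made up for illustration only.
attributes = {
    "user agent":        1 / 50,
    "timezone":          1 / 24,
    "language":          1 / 30,
    "installed fonts":   1 / 1000,
    "screen and window": 1 / 500,
    "codec support":     1 / 20,
    "clock skew":        1 / 100,
}

# Independent attributes: total surprisal is the sum of the parts.
total_bits = sum(-math.log2(p) for p in attributes.values())
print(f"combined: ~{total_bits:.0f} bits")

# ~33 bits suffice to single out one person among 8 billion.
assert total_bits > math.log2(8e9)
```

Even with these modest made-up pool sizes, the total comfortably clears the roughly 33 bits needed to uniquely identify one browser among everyone on Earth.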

fc417fc802 10 hours ago

Most of the things you've listed here don't actually seem all that reasonable to me.

User agents as a concept are rather poorly thought out across the board and not all that useful but persist because that's just how technical cruft is.

Fonts should be provided by the website; if not provided, the choice should take the form of a spec sent by the website including line height, serifs or not, monospace or not, etc. There's little to no excuse for the current font situation IMO beyond poor design decisions that became heavily entrenched.

Timezone and other obviously private metadata should never be shared without the user explicitly granting permission on a case by case basis. The status quo here is completely inexcusable as is the continued failure to fix the problem.

Size of the physical screen should never be exposed under any circumstances. The current size of the browser window is reasonable on its face but now that fingerprinting is understood to be an issue should always be heavily letterboxed unless the user consents to sharing the exact value.

Video formats should be provided by the website as a list of offerings and the browser should respond with a choice; the user could optionally intervene. There's no reason to expose the full capabilities to a remote service.

Querying the current time should be gated behind an explicit permission. There's almost never a need for it. However from a fingerprinting perspective you also have to worry about correlating the rate of clock skew across clients. That can be solved by gating access to high resolution time counters behind an explicit permission as (once again) the vast majority of services have no legitimate use for such functionality.

bblb 7 hours ago

These are all relics from the innocent 90's Internet. We had our global village and everything was fine. A couple of bad actors spamming blue pills here and there and that was it.

Now we have actual criminal organizations and other real bad actors.

I'm sure we can come up with something better than advertise our whole local computing platform on every HTTP request.

BeetleB 12 hours ago

I fantasize having a browser that I can use only for viewing content.

No applications. No mail. No need for cookies.

I can use a "regular" browser for more enhanced stuff. But for simple content consumption, we can just have a "dumb" browser that can't do much.

> A user agent that says the browser's version? Reasonable enough.

No user agent. I'm guessing sites use it to detect JavaScript or HTML feature support and degrade gracefully for old browsers, but let's just not supply one and make it the reader's burden to have a reasonably decent browser.

> Being able to ask for fonts, if the system has them? Difficult to have font support without that.

What's the fallback if the system doesn't have them?

> Getting the user's timezone, language and keyboard layout? Reasonable.

Keyboard layout is irrelevant for viewing content. For timezone and language: Yeah, I can see the use cases, but these are in a small minority. Let there be a popup when requested, and the user can specify the timezone/language as requested.

> The size of the screen, and the size of the browser window? Difficult to lay things out without that.

Let's let this new browser return only from a (small) discrete set of sizes. It will pick the size closest to the actual browser window size and send that.
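That quantization could be sketched as follows (the bucket values are made up for illustration):

```python
# Report only the nearest size from a small fixed set, so the exact window
# dimensions leak at most log2(len(BUCKETS)) bits. Bucket values are invented.
BUCKETS = [(1000, 600), (1200, 700), (1400, 900), (1600, 1000)]

def reported_size(width, height):
    """Snap the real window size to the closest allowed bucket
    (closest by Manhattan distance, for simplicity)."""
    return min(BUCKETS, key=lambda b: abs(b[0] - width) + abs(b[1] - height))

print(reported_size(1366, 768))   # a common laptop size snaps to a bucket
```

With four buckets, every user's window size carries at most 2 bits instead of the many bits an exact pixel dimension can leak.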

> Of course a video or audio player needs to know which video formats your browser supports - how else to provide the right video?

Same answer as user agent. Either let the user pick from a selection of video formats, or just hard code a reasonable one and put the onus on the user to have a browser that supports it.

> Obviously javascript can get the time, and it's trivial to figure out the system's clock error by comparing that to the time on a server.

This hypothetical browser could just not send the time :-) For 99% of content consumption, this function is not needed.

What I'm describing should be part of "Private mode". Or browsers should have an "Ultra-private" mode that is the above. If it's too complex/risky maintaining it all in one codebase ... fine. Just have a separate browser.

Right now, if I built such a browser, I'm sure a lot of sites meant for content would break. But in my fantasy world, using "Ultra-private" would be the default, and people who make sites will target them first.

I think much of the complexity in making a web browser is all the "other" stuff. Being able to run apps, cookie/privacy management, etc.

sandworm101 13 hours ago

The Tor project seeks to bypass this by keeping such things standardized across users, even down to the reported screen size. And there is nothing stopping the browser from fibbing, as most settings don't matter all that much (i.e. UK vs. Canadian vs. American English).

francoi8 11 hours ago

All of these could have a set of standard non-identifiable answers (e.g. Firefox reports the same 20 fonts, a couple of video formats, one among a few standard window sizes, etc.), and anything more extensive/precise would require the user's authorization, with the option of feeding fake info (e.g. a fake timezone).

t-3 15 hours ago

The most popular browser is made by an ad company. They also provide the majority of funding for their biggest competitor. Why would you expect anything different?

john_strinlai 14 hours ago

most people would expect something different from tor, surely.

subscribed 15 hours ago

Hah. It's still better than apps.

Apps have access to inconceivable amounts of identifiers and device characteristics, even on the well protected systems without Google Play services.

Barbing 15 hours ago

>Why don't browsers make it like phones where the server (app) has to be granted permission to access stuff?

Like Android phones perhaps? Unfortunately Apple gives very little granular control.

Joe_Cool 14 hours ago

Most stock android phones don't either. You usually get to control precise location, notifications, some background activity, SMS, Calls, Mic, Camera, SD Card, etc.

But most ROMs don't allow controls for WiFi, Cell data, Phone ID, Phone number, User ID, local storage, etc...

troupo 14 hours ago

It's a fine line between making the web usable, fingerprinting, and peppering the user with dozens or hundreds of permissions.

And since browsers rival OSes for complexity (they are basically OSes in their own right already), any part of the system can be inadvertently exposed and exploited.

kingstnap 15 hours ago

I mean Google ain't paying for Chromium development just for the fun of it...

snowwrestler 10 hours ago

And yet this sort of endless (fingerprintable) browser feature list is what people cite when they claim that mobile Safari is somehow way behind Chrome, and how it’s a travesty that Chrome can’t natively implement all these (again, highly fingerprintable) features on the iPhone.

farfatched 12 hours ago

> Because the behavior is process-scoped rather than origin-scoped

Hmm, I'm a little confused, since in 2021 Mozilla released experimental one-process-per-site:

> This fundamental redesign of Firefox’s Security architecture extends current security mechanisms by creating operating system process-level boundaries for all sites loaded in Firefox for Desktop

https://blog.mozilla.org/security/2021/05/18/introducing-sit...

Perhaps that is not fully released?

Or perhaps it is, but IndexedDB happens to live outside of that isolation?

farfatched 12 hours ago

https://news.ycombinator.com/item?id=47868736 helps me understand that there's a sliver of behaviour that happens to be global, and this thus allows fingerprinting.

If so, cool!

self-portrait an hour ago

Why is Firefox DB open-source MPLv2.0 running .cpp indexedDBdatabses() script on the API:

    namespace mozilla {
    namespace dom::indexedDB {
    using namespace mozilla::dom::quota;
    using namespace mozilla::ipc;
    using mozilla::dom::quota::Client;

b1temy 4 hours ago

> ...stored in the global StorageDatabaseNameHashtable.
> This mapping:
> - Is keyed only by the database name string
> ...
> - Is shared across all origins

Why is this global keyed only by the database name string in the first place?

The post mentions a generated UUID, why not use that instead, and have a per-origin mapping of database names to UUID somewhere? Or even just have separate hash-tables for each origin? Seems like a cleaner fix to me compared to sorting (imo, though admittedly, more of a complex fix with architectural changes)

Seems to me that having a global hashtable that shares information from all origins is asking for trouble, though I'm sure there is a good explanation for this (performance, historical reasons, some benefits of this architecture I'm not aware of, etc.).
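The shape difference being proposed can be sketched in a few lines (this is illustrative Python, not Mozilla's actual code; the real leak is in enumeration ordering, but the keying question is the same):

```python
import uuid

# Illustrative sketch of the two keying shapes: a mapping keyed only by
# database name is shared across origins, while a mapping keyed by
# (origin, name) keeps each origin's state isolated.
global_by_name = {}   # the problematic shape: name -> UUID, all origins
per_origin = {}       # the suggested shape: (origin, name) -> UUID

def open_db_global(origin, name):
    # Two origins creating "mydb" hit the same entry, so any observable
    # behavior derived from it (here, the UUID) is cross-origin state.
    return global_by_name.setdefault(name, uuid.uuid4())

def open_db_scoped(origin, name):
    # Keyed per origin: a.example and b.example never share an entry.
    return per_origin.setdefault((origin, name), uuid.uuid4())

a = open_db_global("https://a.example", "mydb")
b = open_db_global("https://b.example", "mydb")
assert a == b      # shared state: the fingerprinting surface

x = open_db_scoped("https://a.example", "mydb")
y = open_db_scoped("https://b.example", "mydb")
assert x != y      # per-origin: nothing to correlate
```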

sva_ 17 hours ago

Does Tor Browser still allow JavaScript by default? Because if you block execution of JavaScript, you won't be affected, from what I understand.

angry_octet 11 hours ago

Because TBB has javascript on by default, turning it off increases your signature. It would be better if TBB defaulted to js off, with a front panel button to turn it on.

JS also dramatically improves security. TBB is stuck in a 90s mindset about privacy, as if Firefox exploits were not a dime a dozen. Especially with AI making FF exploits more available, we can expect many tor sites to be actively attacking their visitors.

ux266478 11 hours ago

> turning it off increases your signature.

Tor endpoints are pretty easy to identify (there are plenty of handy databases for that), so using Tor to begin with increases your uniqueness. If NoScript were set to strictly disallow JavaScript by default, that would decrease the degree to which it increases your signature relative to the baseline of using Tor.

Then we have to account for the simple fact that many, many fingerprinting techniques rely on javascript, so taking them out of the picture reduces the unique identity that can be gleaned.

Are we absolutely, positively sure that the tradeoff is worth it? Without a strict, repeatable measurement, I'm highly skeptical about whether a default of "allow" is a net boon to hiding your identity. I remember the rationale for the switch mostly being "most of the web is broken otherwise and that's bad."

Phelinofist 4 hours ago

> JS also dramatically improves security

How so?

ranger_danger 17 hours ago

Disabling JavaScript actually greatly increases your fingerprint as not many users turn it off, so that instantly puts you in a much smaller bucket that you need to be unique in. Yes, not having JS means it limits your options for gathering other details, but it also requires much less effort to be unique now without JS.

Tor Browser also doesn't spoof navigator.platform at all for some reason, so sites can still see when you use Linux, even if the User-Agent is spoofing Windows.

Springtime 16 hours ago

> Disabling JavaScript actually greatly increases your fingerprint as not many users turn it off, so that instantly puts you in a much smaller bucket that you need to be unique in.

I've heard a handful of people say this, but are there examples of what I would imagine would have to be server-side fingerprinting, and how granular is it? Most fingerprinting I'm aware of is client-side, running via JS. I expect server-side checks to be limited to things like which resources haven't been loaded by a particular user and anything else normally available via server logs, which could limit the pool, but I wonder how effective that is for tracking uniqueness across sites.

throwawayqqq11 16 hours ago

I have my problems with that argument. Yes, fewer identifying bits means a smaller bucket, but for the trackers it also means more uncertainty, doesn't it? So when just a few others without JS join your bucket, e.g. via a VPN, profiling should become harder.

hypeatei 16 hours ago

> increases your fingerprint as not many users turn it off

We're talking about users of the Tor browser, and I'd be very surprised if this was the case (that a majority keep JS turned on)

Basically every Tor guide (heh) tells you to turn it off because it's a huge vector for all types of attacks. Most onion sites have captcha systems that work without JS too which would indicate that they expect a majority to have it disabled.

codedokode 16 hours ago

Honestly it seems that most of Web Standards are used mostly for fingerprinting - I think a small number of websites uses IndexedDB (who even needs it) for actually storing data rather than fingerprinting.

That's why expansion of web standards is wrong. Browser should provide minimal APIs for interacting with device and features like IndexedDB can be implemented as WebAssembly library, leaking no valuable data.

For example, if canvas provided only access to picture buffer, and no drawing routines calling into platform-specific libraries, it would become useless for fingerprinting.

Dwedit 15 hours ago

You can use a browser extension like "Local Storage Editor" to see the contents of the Local Storage of a website. So far, I've seen it used for caching long-life images (like on gmail), or used as another way to do logins instead of cookies.

troupo 14 hours ago

> You can use a browser extension like "Local Storage Editor" to see the contents of the Local Storage of a website.

Or just open dev tools

fc417fc802 10 hours ago

I'm with you up to the bit about canvas. The problem there is that if you want hardware acceleration then either you can't permit services to read back what was rendered (why do they need to do that again?) or else you're inevitably going to leak lots of very subtle platform specific details. Personally I think reading back the content of a canvas should be gated behind a permission dialog.

biosboiii 4 hours ago

Tor on Chromium, when?

Seriously, I am saddened that Chromium dominates the browser market as much as it does, but at this point the herd-immunity of Chromium is necessary to keep users safe.

keepamovin 2 hours ago

To answer "Tor on Chromium, when?", well - you can actually do this right now using BrowserBox! It has a built-in tor-run function that connects Chrome to a Tor SOCKS proxy, and it wraps any other browsing-related network calls over torsocks as well.

Because it's an isolated remote browser, you also get a lot of flexibility. You can run BrowserBox itself as an onion hidden service connected to the clearnet, or connect BrowserBox to browse over Tor, or even do both at the same time. Since this Firefox IndexedDB vulnerability relies on persisting state, you can completely avoid it by running BrowserBox (based on Chromium), and doing it ephemerally. There's actually a new GitHub action [0] that makes spinning up a purely ephemeral, disposable session incredibly easy and would be immune to this kind of process-level state tracking.

The action runs BrowserBox on a GitHub Action Runner; you can specify whether you want a CloudFlare tunnel or a Tor tunnel (which comes with torweb access). And there's a convenience script you can use to run from the command line, which does the setup then spits out your login link.

All you need is a BrowserBox license (not free), but then you can use it.

I would consider this a lightweight Tor-proxied browser, not a replacement for Tor Browser, at this time, as there are likely edges and leaks that the official Tor Browser has long patched. However, as cases like this IDB bug demonstrate, no security is perfect. If you simply want a way to access Tor, and add an extra "ephemeral" hop on a runner, itself over Tor, and are not trying to do anything especially sensitive or life-threatening, it's probably good.

[0]: https://github.com/marketplace/actions/browserbox

[1]: https://github.com/BrowserBox/BrowserBox

Cider9986 8 hours ago

https://archive.ph/BbVZo — for those that would rather be fingerprinted by Google than fingerprint-com

Meneth 16 hours ago

I'm confused.

The IndexedDB UUID is "shared across all origins", so why not use the contents of the database to identify browsers, rather than the ordering?

nneonneo 16 hours ago

There's an instructive example on the page. Suppose a page creates the databases `a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p`, then queries their order. They might get, for example `g,c,p,a,l,f,n,d,j,b,o,h,e,m,i,k`, based on the global mapping of database names to UUIDs.

The key vulnerability here is that, for the lifetime of that Firefox process, any website that makes that set of databases is going to see the exact same output ordering, no matter what the contents of those databases are. That makes this a fingerprint: it's a stable, high-entropy identifier that persists across time, even if the contents of those databases are not preserved. It is shared even across origins (where the contents would not be), and preserved after website data is deleted -- all a website has to do to re-acquire the fingerprint is recreate the databases with the same names and observe their ordering.
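A toy model of that ordering leak (pure Python, just to illustrate the mechanism, not Firefox's implementation):

```python
import math
import random
import string

names = list(string.ascii_lowercase[:16])   # the databases a..p from the example

def browser_session():
    """One browser process: each database name gets a hidden random key,
    and enumeration returns the names sorted by that key (standing in for
    the UUID-based global mapping)."""
    hidden_keys = {n: random.random() for n in names}
    return lambda: sorted(names, key=hidden_keys.__getitem__)

enumerate_dbs = browser_session()
fp1 = enumerate_dbs()
fp2 = enumerate_dbs()
assert fp1 == fp2       # stable for the whole session, across origins

restarted = browser_session()   # restart: fresh keys, a new permutation
assert sorted(restarted()) == names

# The ordering of 16 names carries up to log2(16!) ~ 44 bits of identifier:
print(f"{math.log2(math.factorial(16)):.2f} bits")
```

This is what "high capacity" means in the article: a permutation of 16 names encodes roughly 44 bits, far more than enough to distinguish individual sessions.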

Joe_Cool 14 hours ago

As I understood it, not just ANY website can see it, but the same website can see it regardless of whether you reset your identity in Tor Browser.

So it persists between anonymous sessions. You could connect User A, who logged out and reset the identity, to User B, who believed they were using a fresh anonymous session and logged in afterwards.

lxgr 16 hours ago

The content is obviously scoped to an origin, or IndexedDB would be a trivial evercookie.

AgentME 16 hours ago

It's the mapping of UUIDs to databases that is shared across origins in the browser. Only the subset of databases associated with an origin are exposed to that origin.

Fokamul 3 hours ago

Imho, EU should make any fingerprinting illegal in all browsers.

And all browser devs should be required to actively fight against fingerprinting.

There is no legitimate need for fingerprinting in browsers.

jonathanstrange 2 hours ago

Fingerprinting is done by servers, not by browsers, and it is already illegal in the EU when it is done without explicit user consent and according to the GDPR data handling requirements. The GDPR covers all of this, it doesn't matter where the data comes from.

crazysim 17 hours ago

I would imagine most users of Tor are using Tor Browser. I read there was a responsible disclosure to Mozilla, but is it just me, or did that section leave out when the Tor Project planned to respond or release a fixed Tor Browser? Do they track Firefox releases very closely, or is there a large lag?

flotzam 17 hours ago

Tor Browser is always quick to rebase on the latest Firefox ESR. They released an update the next day:

https://blog.torproject.org/new-release-tor-browser-15010/

VladVladikoff 9 hours ago

Why are these databases not scoped to the origin of creation like cookies?

AgentME 9 hours ago

They are. The leak is that if a webpage you visit creates several databases with certain names, the order is random but stays the same within the same browser session.

heavyset_go 10 hours ago

There are others that Cloudflare and friends use for fingerprinting.

wolvoleo 14 hours ago

Tails (without persistent storage) will mitigate this though. I'm not too concerned.

wlonkly 13 hours ago

I'm not sure it will. The problem in Tor here is that the ordering persists beyond "New Identity". It does not persist between browser restarts.

wolvoleo 10 hours ago

But that's the key thing about Tails: you start it fresh every time from a clean USB stick or ISO image.

It's more than a browser restart; it's a complete system wipe every time.

Tails is built on the premise that exactly this kind of trick will occur, sometimes even persisting between browser restarts. For that reason even the persistent storage is very limited, and it's optional and cautioned against for maximum anonymity.

What would be worrying with Tails is if there were some way for a hardware identifier to be exposed, like a serial number or MAC address. But this kind of thing is exactly what it's made to protect against.

ranger_danger 11 hours ago

You can also fingerprint browsers profile-wide across sessions without any JS, CSS or even HTML, using the favicon: https://github.com/jonasstrehle/supercookie
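The supercookie trick boils down to abusing the favicon cache as a bit vector. Here's a toy sketch of just the encoding/decoding step (the paths and layout are made up for illustration; see the linked repo for the real implementation): on first visit the server serves favicons only for the paths whose bit is 1, and on later visits the set of favicon requests that *don't* reach the server (because they're cached) spells the ID back out.

```javascript
// Map an N-bit visitor ID to the favicon paths that should end up cached.
// Path scheme "/f/<bit>/favicon.ico" is hypothetical.
function idToCachedPaths(id, bits) {
  const paths = [];
  for (let i = 0; i < bits; i++) {
    if ((id >> i) & 1) paths.push(`/f/${i}/favicon.ico`);
  }
  return paths;
}

// Recover the ID from the set of paths the browser did NOT re-request
// (i.e. the ones it already had cached).
function pathsToId(paths) {
  let id = 0;
  for (const p of paths) {
    const i = parseInt(p.split("/")[2], 10); // bit index from the path
    id |= 1 << i;
  }
  return id;
}
```

32 such paths are enough to give every visitor a unique 32-bit ID, which is why per-session cache partitioning (or Firefox's accidental cache-bypass bug mentioned below) defeats it.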

snailmailman 11 hours ago

I think most browsers have patched this out? I didn't do super thorough tests, but at least on my machine their demo fails to fingerprint me across private browsing/incognito sessions as it claims. Tested in Firefox and Edge.

ranger_danger 10 hours ago

Not sure about Chromium-based browsers, but the author of this paper on the technique:

https://www.ndss-symposium.org/wp-content/uploads/ndss2021_1...

Says that Firefox has a bug that prevents favicons from being loaded from cache, which inadvertently protects against this technique. They filed a bug report on it in 2020 but nothing has happened with it yet: https://bugzilla.mozilla.org/show_bug.cgi?id=1618257

zzo38computer 8 hours ago

Some users disable favicons; I am one of them (although that is mainly because I do not use them, rather than because of this technique).

anthk 16 hours ago

The best for Tor would just be Links2/Links+ with the socks4a proxy set to 127.0.0.1:9050, enforcing all connections through a proxy in the settings (mark the checkbox), and disabling cookies altogether.

angry_octet 11 hours ago

The best is probably Tor in a VM, Chromium in a separate VM with JavaScript disabled, on a private virtual network, with an egress firewall (not just guest VM firewalls, but enable those too) that only allows traffic from a specific origin port on the Tor machine. You would also want the VM to spoof the processor features and unique IDs. System time drift/offset remains a vector that is hard to deal with.
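The egress-firewall part can be sketched as an nftables ruleset on the host; the interface name and address here are hypothetical, and this is a minimal illustration rather than a hardened config:

```
table inet egress {
  chain forward {
    type filter hook forward priority 0; policy drop;

    # Return traffic for connections the Tor VM already opened.
    ct state established,related accept

    # Only the Tor gateway VM may originate outbound TCP (to relays/bridges).
    iifname "vnet-tor" ip saddr 10.0.0.2 meta l4proto tcp accept
  }
}
```

The default-drop forward policy is what enforces that the browser VM has no path to the outside except through the Tor VM.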

Dump the rendered window pixels out to a simple viewer. Mouse movement is still a pain to deal with, but I would default to spoofing it as moving between clicks, with some image parsing logic to identify menu traversal.

Then it should reboot the browser process regularly.

I've been waiting for someone to make a packaged 'VPC in a box' incorporating networking and linked VMs.

keepamovin 2 hours ago

Your idea of "dumping the rendered window pixels out to a simple viewer" with Chromium is essentially Remote Browser Isolation (RBI). If you're looking for a packaged way to do this, BrowserBox does exactly this and has a tor-run function built-in, which:

connects Chrome to a Tor SOCKS proxy and wraps all other browsing-related network calls over torsocks. It prevents local fingerprinting leaks (like this IndexedDB ordering bug) because the browser isn't running locally at all. You can host the BrowserBox instance as an onion hidden service, use it to browse over Tor, or both.

If you want to try an ephemeral "VPC in a box" style setup where the environment is destroyed after you're done, you can easily spin it up using this new GitHub action: https://github.com/marketplace/actions/browserbox (but you need a license key, obtainable at https://browserbox.io)

This is my attempt to make it easy to spin up bbx on ephemeral infrastructure that's mostly free (GitHub Actions runners are perfect).

anthk 5 hours ago

Links can be forced to pass all connections through a proxy, so a firewall might be redundant. You can almost forget about the mouse, too: Links can render the page either to plain X11 or to a terminal.

fc417fc802 10 hours ago

> enforcing all connection thru a proxy in the settings (mark the checkbox)

Just use a network namespace; individual pieces of software are way too easy to misconfigure.

anthk 5 hours ago

Links literally gives you a graphical (X11) or terminal-based checkbox in the settings menu to enforce everything through the proxy. Not that easy to misconfigure. If you are going to use Tor, you shouldn't just be using Tor Browser by default either, if it enables some JS options. Firefox's base is too huge to configure such that nothing ever leaks. There are too many components: A/V, WebGL, telemetry, WASM, WebRTC...

sixothree 16 hours ago

Would whonix fit that bill?

fsflover 17 hours ago

It seems Qubes OS and Qubes-Whonix are not affected.

handedness 13 hours ago

> It seems Qubes OS and Qubes-Whonix are not affected.

This is dangerously incomplete and bad advice.

Qubes OS does not work the way you seem to think it does.

Creating a new identity in the Tor Browser inside a disposable VM does not automatically stop that VM and start a new disposable VM. That initial disposable VM launches the new identity from the existing process and therefore remains vulnerable, the same as any bare metal computer running Tor Browser would.

Virtualization is not magic.

A Qubes OS user needs to spin up a new disposable Whonix VM to sidestep this attack. Creating a new identity alone is ineffective in this threat model.

If you care about these projects as much as you say you do, please stop giving harmful advice. You do it in various places on the Internet and in every thread which gives you half a chance to do so, and these projects would be better off if you either took any of the extensive well-reasoned correction many people offer you, or opted to stop making such claims. The former would be ideal, the latter still vastly preferable to the existing state of affairs.

hrimfaxi 17 hours ago

How so? If you kept a disposable VM open and just created new identities in tor browser, how does Qubes mitigate the threat here?

handedness 13 hours ago

I believe you are correct, and that this poses a significant risk for people who don't properly understand the underlying concepts.

A Qubes OS user needs to start a new disposable Whonix workstation VM to sidestep this attack, NOT create a new identity in the same disposable VM's browser, which is exactly what this attack targets.

fsflover 17 hours ago

On Qubes, you do not create a new identity in the same VM. This would go against the Qubes approach to security/privacy. Using separate VMs for independent tasks is the whole point of using Qubes.

2ndorderthought 16 hours ago

In the last ten years, has Qubes moved on to support more hardware? Every four years I would try to use it, only to find it didn't support any of my hardware.

handedness 15 hours ago

Qubes OS hardware support, while still far from perfect, is vastly better than it was ten years ago.

Joanna Rutkowska's understandable preference for older kernels had its advantages, but the current team is much more likely to ship somewhat newer kernels and I've been surprised by what hardware 4.3 has worked well on.

Beyond that, I'm currently running a kernel from late Feb/early Mar (6.19.5).

Driver support can still be an issue, and a Wi-Fi card that doesn't play nice with Linux in general is going to be no different on Qubes OS.

Aachen 16 hours ago

We buy off-the-shelf laptops; I'm not sure anyone ever checked that a machine can run Qubes specifically before trying to install it (I'm sure of at least one person: myself). Doesn't just about any x64 machine whose hardware has drivers in standard kernels also work with Qubes? What have you bought that's not supported?

hrimfaxi 16 hours ago

No problems on framework laptop that I've run into at least.

orbital-decay 16 hours ago

Most hardware (especially GPUs) is hard to virtualize in a secure manner, which is the entire point of Qubes. People who use it typically buy compatible hardware.

fsflover 16 hours ago

Tested hardware can be found here: https://qubes-os.org/hcl. New hardware is constantly being added. If you plan to switch to Qubes, consider buying something from that list or, better, certified or community-recommended hardware linked there.

ranger_danger 17 hours ago

Source?

fsflover 17 hours ago

Different VMs result in different identifiers.

LoganDark 16 hours ago

> For developers, this is a useful reminder that privacy bugs do not always come from direct access to identifying data. Sometimes they come from deterministic exposure of internal implementation details.

> For security and product stakeholders, the key point is simple: even an API that appears harmless can become a cross-site tracking vector if it leaks stable process-level state.

This reads almost LLM-ish. The article on the whole does not appear so, but parts of it do.

shevy-java 17 hours ago

Well, that sucks. I guess in the long run we need a new engine and a different approach. Someone should call the OpenBSD guys to come up with working ideas here.

giancarlostoro 16 hours ago

> Mozilla has quickly released the fix in Firefox 150 and ESR 140.10.0, and the patch is tracked in Mozilla Bug 2024220.

Did you even read the article at all? "Ah, my children did badly in school, time to replace them with new children and a different spouse." That is essentially what you're suggesting. A browser is not something you simply make out of thin air. There are decades of nuance in browser engines, and I'm only thinking of the HTML nuances, not the CSS or JS ones.

anthk 16 hours ago

Given the dangers of JS and WASM, they could just fork NetSurf and enhance the CSS3 support. If you are a journalist, running Tor with JS and tons of modern web tech enabled makes you a bright white spot in a sea of darkness.

fsflover 16 hours ago

Here you go: https://qubes-os.org.

Barbing 15 hours ago

>Why Qubes OS?

>Physical isolation is a given safeguard that the digital world lacks

>In our digital lives, the situation is quite different: All of our activities typically happen on a single device. This causes us to worry about whether it’s safe to click on a link or install an app, since being hacked imperils our entire digital existence.

>Qubes eliminates this concern by allowing us to divide a device into many compartments, much as we divide a physical building into many rooms. …

Sold

https://doc.qubes-os.org/en/latest/introduction/intro.html

handedness 12 hours ago

You should note that improperly using Qubes OS, creating a New Identity inside of Tor Browser, even in a disposable Whonix workstation VM, would leave one vulnerable to this.

A user would have to manually start a new disposable VM for each identity.

fsflover 5 hours ago