
Hi, from Iran with love!

First of all, thank you moxie and signal team for this proxy.

Until 2018, many Iranians used Telegram, but Iran's regime blocked the messenger after Russia did. Telegram released MTProxy, and that proxy was helpful. Russia has since lifted its ban on Telegram, but the app is still blocked in my country; with VPNs, many Iranians still use it. After 2018, the second most popular messaging app in Iran was WhatsApp. After Facebook's new privacy policy, like all of you, many Iranians switched from WhatsApp to Signal. The mullahs' regime removed the Signal app from the Iranian app stores and started blocking all Signal traffic in the country, but they don't block WhatsApp. I'm not paranoid, but it is difficult for me to understand why they didn't block WhatsApp after 2018. Can they break WhatsApp's encryption?

I have a suggestion for the Signal team: please put Tor in Signal. Tor is better than any proxy or VPN.



I just set up one of these Signal proxies. Hope it helps you and others in your country communicate freely and safely. [1]

Regarding Tor: if you want a Signal-like app that uses an onion router look at Session. [2]

It uses the same encryption protocol and a very similar UI to Signal, but routes all traffic through the Loki network, so your traffic passes through three nodes. It is an onion network like Tor.

One other benefit of Session is the lack of metadata inherent to its design. No phone numbers or even usernames are attached to your account. You get a set of characters that looks similar to a bitcoin address and a QR code to make sharing it easier.

Of course this lacks the convenience of Signal but it’s as hard to block as Tor.

[1] https://signal.tube/#signal.xanny.family

[2] https://getsession.org


Session has:

1. An associated cryptocurrency (not outright bad, but a weird smell IMO) [1]

2. Abandoned perfect forward secrecy and deniability [2]

3. Never completed an audit (though supposedly one is in progress) [3]

There are a million and one encrypted chat programs out there. Why should I use this one?

[1]: https://github.com/oxen-io/oxen-mobile-wallet

[2]: https://getsession.org/session-protocol-technical-informatio...

[3]: https://getsession.org/faq/


I mentioned it because it has a seamlessly built in onion routing protocol. I read further down the thread that Tor is blocked in Iran, but I’m guessing the same is unlikely to be true of Loki/Oxen simply because it isn’t nearly as well known.

The lack of metadata is also quite a unique selling point in my eyes. There’s a million encrypted messengers now sure. How many automatically connect through an onion router with zero config required and don’t require you to create an account at all, but instead assign you a random ID disconnected entirely from your phone number, email, and other personal identifiers?

All I'm saying is that it's certainly an option to consider. Tor was mentioned, so Session popped into my head for the reasons above.

Regarding PFS. They currently implement the Signal Protocol. Session is of course FOSS so anyone can check this. Your source does say they’re planning to fork it as the Session Protocol later this year so it integrates with their network more easily. But that’s an upcoming, unfinished project. To be honest I don’t know much about it as it’s still in development. I do know that currently Session uses the Signal Protocol through an onion router without the need to so much as create an account.

And yes the network itself is a bit of a convoluted idea that tries to do many things at once, but the fact they run on a blockchain means they already have a lot of nodes set up in different countries around the world through which to route traffic, and the reason they could build a decentralised network quite quickly despite being a relatively young project is they incentivise those node operators with cryptocurrency.

Because it is a young project they are still undergoing audit yes. This is absolutely something worth noting. It’s a relatively new project. It’s no longer in beta, but nowhere near as well established as Signal. However it’s precisely because of this it’s unlikely governments are bothering to target it yet.


Jami and Tox are completely decentralized messaging systems. They are not directly associated with the "blockchain" buzzword either, so they have that going for them, too.

https://jami.net/

https://tox.chat/


Jami uses git and TLS to implement E2E encrypted chats.[1] That doesn't sound all that secure to me. I'd feel much more comfortable using a fork of Signal with an onion router.

I don't get the prejudice some have against blockchains either. It's not even like this is Keybase where they shoehorned a crypto wallet into the app. They do have a wallet but it's a totally separate application. Session is purely a messenger and nothing else, you'd have no idea a blockchain was involved at any part of the backend if you weren't told about it.

I have played around with Tox and it's a cool project, but it's been in beta since 2014 and is not well optimised for mobile at all. I don't think it'll go very far personally.

[1] https://jami.net/swarm-introducing-a-new-generation-of-group...


The Session Protocol is currently running on Session [1]. It's actually the reason they were able to allow up to 100 people in closed groups.

[1]: https://getsession.org/session-release-roundup-10/


Looking at the technical writeup [1] it sounds like they're currently running a hybrid of the two as they do a staged rollout of the various changes they've made to the Signal Protocol.

But yes you are correct it looks like they're justifying ditching PFS by saying "if someone has your keys you're screwed anyway."

However they're not just stripping away security outright, it's more that they're betting on onion routing to cover the user instead - no one who is sniffing traffic on your WiFi network will be able to get your keys because the traffic is routed through the onion network, therefore the only way anyone would get those keys is by pwning your entire device or having physical access to it while it's decrypted which, as they note, is endgame no matter what messenger you use.

I don't necessarily think this is smart as it's best to not put all your eggs in one basket especially where security is concerned. But given all traffic goes between 5-7 nodes [1] the scope for someone without remote or physical access to your device to get your keys is extremely limited assuming their onion network is as secure as they claim.

As for deniability, they sign the message with the long-term keypair, but once the message is validated this signature is wiped. So again it pretty much comes down to relying on their onion routing to ensure this signature isn't intercepted in transit.

Finally I think it's relevant Session is really designed for a different use case than Signal - since your ID is not connected to any personal identifiers, you can wipe it whenever you want and get a new ID that has zero cryptographic connection to the old one. So while there's no ratcheting of keys, the intention isn't really for someone to stick with the same account for years like it is for Signal where your account uses your phone number as an identifier.

I'll wait for the results of the audit, if they come out and say Session is fundamentally flawed I'll happily concede. I have zero ties to this project aside from finding it useful for particular use cases. My prediction is the initial audit will find some potential vulns as they roll out more of the Session Protocol simply because it's a new fork. Probably why they're getting the audit done now. That's the responsible move when forking a crypto protocol it seems to me.

I maintain that for people who want a messenger that knows as little about them as possible, doesn't rely on a personal identifier, and connects to an onion router reliably (by comparison, using Tor on mobile is... not a good experience) it's at very least an interesting project to watch even if it still needs time to mature.

[1] https://getsession.org/session-protocol-technical-informatio...


Personally I love Session. I agree with their logic.

As we have seen, if they have your device, it’s over even if you use ratcheting etc.

https://www.forbes.com/sites/thomasbrewster/2021/02/08/can-t...


My only annoyance with the crypto currency is that it doesn’t have a good UX yet. They have stated before though that Session will always remain free for everyone. I think compensating node operators in some capacity makes sense but if it’s not implemented well, node operators feel a bit screwed over.

Regarding your footnote #2 about PFS, it said this (among other things).

> In some theoretical scenarios, these properties do protect users; however the utility of these protections in real-world scenarios is often more limited in scope than might be expected. We must also consider that these safeguards are offered at the expense of additional complexity, decreased account portability, and multi-device limitations. These protocols were simply not designed to be run over a decentralised network.

I have to say, I’ve considered the utility of it myself. It seems to me that I’m far, far more likely to have my chats compromised through my chats being stored in plain text on my devices than in a technical scenario that PFS could have prevented. What do you think?


> My only annoyance with the crypto currency is that it doesn’t have a good UX yet. They have stated before though that Session will always remain free for everyone. I think compensating node operators in some capacity makes sense but if it’s not implemented well, node operators feel a bit screwed over.

Preface: I've been in it since 2011, but I entirely agree. It's still too complicated for most users, but we as a community have come a long way from where we were 10 years ago, so we certainly have blinders as to where to improve.

Could you say what specifically makes the overall UX so disappointing? I ask because I think we are now entering a phase of onboarding a lot of non-tech people onto the BTC network for payment settlement and other misc financial services, something I already have personal experience with, and having a tech person line these issues out would be a tremendous help.

Thanks!


I haven't played with their crypto wallet honestly, so I can't provide any kind of informed opinion on it. It's a fork of Monero, but Monero is more widely accepted. Monero's wallet has a terrible UX as well, though, so if they've forked the wallet too, that'll be why.

Session has a pretty nice UX though, better than Wickr which has been around longer, although not as refined as Wire. It's pretty much a more barebones Signal without phone numbers as far as pure UX goes. It should follow the system setting for dark mode though. No idea why it doesn't.

> I have to say, I’ve considered the utility of it myself. It seems to me that I’m far, far more likely to have my chats compromised through my chats being stored in plain text on my devices than in a technical scenario that PFS could have prevented. What do you think?

To be honest I agree. It's always good to have layers of security in case there's an exploited vulnerability in one. However, it seems pretty far fetched that someone with the ability to pull off a successful attack to grab your keys in the first place is going to be stopped from grabbing past messages by PFS.

If they've got your keys they've either cracked the layers of encryption that protect them in transit (highly unlikely) or they've pwned your device (much more likely). If they've got remote code execution on your device PFS isn't going to make any difference to anything. As you say, messages stored in plaintext is a far bigger real world risk.

This is why I always have disappearing messages on. That makes me feel safer than PFS. At least then in the worst case scenario past messages simply aren't there in storage.

To put it another way, if I had to choose between disappearing messages and PFS, and I could only have one, I'd choose disappearing messages.

And it does also seem to me less like they're removing security, more like they're relying on a different form of it. They're having to strip PFS to make the protocol work reliably on their onion network. The fact all my traffic goes E2EE through 5-7 nodes, as per the technical description, should provide a strong level of protection against any traffic sniffing threat model assuming their fancy new way of doing onion routing is as secure as they say.

It seems to me that's what it really hinges on. As long as their onion router is actually secure, the E2EE messages are going to be secure in transit.

The biggest risk then would come back to unpatched exploits or 0days in the OS or side channel attacks in Session itself. That's how most attacks on messengers succeed after all, not by complex attacks on the crypto protocol but through side channel attacks in the application or exploits in the underlying OS.


Interesting tidbits I'm reading about different chat apps targeted for privacy and security. On a related question, how do you rank the various prominent apps like Signal, Telegram and Whatsapp on their cryptographic security measures?


>Abandoned perfect forward secrecy and deniability...

I suspect that is the future for encrypted messaging. Pretty much everyone ends up keeping their old messages around, thus negating the value of forward secrecy [1]. Deniability ends up being just some forgeability scheme in most cases [2].

So the benefit of those features turned out to not be worth the extra risk of the added complexity.
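The "forgeability" point can be illustrated with a shared-key MAC, the kind of mechanism OTR-style deniability builds on: because both parties hold the same key, either one could have produced the tag, so a transcript proves nothing to a third party. A minimal stdlib Python sketch; the key and messages are invented:

```python
import hashlib
import hmac

# Alice and Bob share a symmetric MAC key (e.g. derived from a handshake).
shared_key = b"mac-key-derived-from-handshake"
message = b"meet at noon"

# Alice authenticates the message to Bob.
alice_tag = hmac.new(shared_key, message, hashlib.sha256).digest()

# Bob can verify it...
assert hmac.compare_digest(
    alice_tag, hmac.new(shared_key, message, hashlib.sha256).digest()
)

# ...but Bob could equally have forged any message himself, since he
# holds the same key. A third party shown (message, tag) learns nothing
# about which of the two wrote it; that forgeability is the deniability.
bob_forgery = hmac.new(shared_key, b"something incriminating",
                       hashlib.sha256).digest()
print("any tag verifies for whoever holds the key")
```

A real signature (asymmetric key) would be the opposite: only the holder of the private key could have produced it, which is exactly why signed messages are not deniable.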

[1]: https://articles.59.ca/doku.php?id=pgpfan:forward_secrecy

[2]: https://articles.59.ca/doku.php?id=pgpfan:repudiability


I don't understand the point of reinventing many wheels. Why not build a friendly chat app on top of established onion routing protocols (Tor/I2P), or add your own onion routing backend (proxy) to established federated chat apps like Conversations, which already has perfect Tor integration for Jabber/XMPP over .onion servers?

Also, Session is promoted as a non-profit project, but following links around about LokiNet and Oxen you find out about a blockchain-based cryptocurrency, which is known to be an anti-pattern on many levels (though they use Proof-of-Service, not Proof-of-Work, which is slightly less bad).

Finally, Session appears to be free software (good), but is not distributed on F-Droid, the only privacy/security-friendly app store for Android. They encourage you to download random APKs from Google Play (which requires the Google Play Services malware and a Google account) or GitHub (owned by Microsoft; though I note they sign checksums with PGP on GitHub, so it's safer to download from there than Google Play, even though they don't provide instructions on how to verify signatures). On F-Droid, they could either have F-Droid build/distribute the package with F-Droid's PGP key, or open their own F-Droid repo with their own PGP key (like NewPipe did), or both.

I really appreciate their communication around dissent and the need to protect communications to help people against their governments. However, the three points I just noted are really shady, to say the least. I understand they need to please investors to put money in their fridges, but trying to mix for-profit incentives with non-profit services to the communities is always a dead end.


Here's my Signal proxy, I hope it helps someone:

https://signal.tube/#signal.jacko.co.uk


> can they break whatsapp encryption

They don’t have to, they just need Facebook to cooperate.


I think you mean the phone vendors, as they are the ones holding the unencrypted chat history in the user's cloud storage. Facebook themselves do not have access to the chat logs (unless they are compelled to inject keys).


They could literally have a hidden function in WhatsApp that scoops up all your chat history and sends it to Facebook if the government asks them to. It's closed source. No one has a clue what it's doing.

To be clear I’m not suggesting this is absolutely happening. I’m merely pointing out it’s entirely possible from a technological perspective given it’s closed source software owned by Facebook. That’s not a recipe for privacy.


To be clear about the threat vector, there's also nothing stopping Signal from doing the same if they wanted to. It's impossible to tell if the version of Signal you download from the app store is unmodified from the code you can find on GitHub. I trust Signal more than I trust Facebook, but if you use Signal, even though it's open source, you still have to trust them not to put anything funky in the binary they upload to Apple/Google.

I'd love for iOS and android to add some sort of OS-level application hash or something. "This app was compiled with xcode version X / llvm version Y with this set of options. The resulting binary hashes to ZZZ". That way with the source code you could verify that the binary on your phone is unchanged.

(Another approach would be to get apple / google to do the compilation themselves from the project on github. If apple builds my project, they could put some signed metadata in the bundle saying "We (apple) compiled this from git SHA XXX")
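The "signed build metadata" idea can be sketched in a few lines. A toy Python illustration, with HMAC standing in for the platform's real (asymmetric) code signing, and all names, hashes, and field layouts invented:

```python
import hashlib
import hmac
import json

# Stand-in for the store's signing key. A real scheme would use an
# asymmetric signature so users could verify without holding any secret.
STORE_KEY = b"store-signing-key"

def attest(binary: bytes, git_sha: str, toolchain: str) -> dict:
    """Store side: bundle build provenance with the binary's hash and sign it."""
    record = {
        "git_sha": git_sha,
        "toolchain": toolchain,
        "binary_sha256": hashlib.sha256(binary).hexdigest(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(binary: bytes, record: dict) -> bool:
    """User side: check the signature and that the installed binary matches."""
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "sig"}, sort_keys=True
    ).encode()
    sig_ok = hmac.compare_digest(
        record["sig"], hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    )
    return sig_ok and hashlib.sha256(binary).hexdigest() == record["binary_sha256"]

apk = b"\x00fake app binary\x00"
rec = attest(apk, git_sha="abc123", toolchain="llvm-12")
print(verify(apk, rec))         # matching binary -> True
print(verify(apk + b"!", rec))  # tampered binary -> False
```

With reproducible builds, anyone could then rebuild from `git_sha` with the stated toolchain and confirm the hash in the signed record is the hash of what the store actually ships.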


Hi there, Signal Android dev here. We have reproducible build steps, so you actually can verify the code is the same :)

https://github.com/signalapp/Signal-Android/tree/master/repr...


Reproducible builds do not help you determine whether the version you download via the Play Store (or, for those on enterprise devices, any pre-installed corporate stores) is the same as the one you build; the Play Store presents no real means to verify that. This includes any auto-updates, if they are enabled.

It's an issue with Play Store as a delivery channel, the individual app in question can't do much about that.

Reproducible builds help if you:
- download the APK separately (including from the Signal website, or some of the other sources)
- install the file locally via sideload
- disable updates (!)


The instructions posted by the dev directly include instructions for pulling the APK from your phone which was installed through the Play Store.

https://github.com/signalapp/Signal-Android/tree/master/repr...


This is very true. Reproducible builds for mobile apps would be far superior. You can build Signal from source for Android if you wish, although obviously this is a massive pain to do for each update, there’s absolutely nothing stopping you from doing it.

On iOS it's a lot more difficult to get the required certificates from Apple but you can run your own build in Xcode and deploy it to your personal device if you are a registered Apple developer.

While reproducible builds are obviously the gold standard, for apps you install from the Play Store or the App Store, developers sign the apps that get distributed with their own private keys. As Google and Apple don’t have access to these it should be verifiable that the apps are not tampered with.

There is an exception here with the Play Store, where there is an opt-in option for Google to sign the app on your behalf [1], but I think we can safely assume Signal are manually signing with their own private keys.

In any case it's easy to just grab an APK from an Android device and check signatures for yourself.

For iOS though, no surprises here it’s locked down. Although from what I gather reading Apple’s security documentation, it confirms that apps must be signed by developers with their private keys. [2] But unlike Android there’s sadly no way I can tell for the user to independently verify this without jailbreaking.

But ultimately, short of building each version yourself, all this is moot if you distrust the developers.

[1] https://developer.android.com/studio/publish/app-signing [2] https://manuals.info.apple.com/MANUALS/1000/MA1902/en_US/app...


Are there any instructions to build the app yourself for signal?


Yes, they have some docs on GitHub for how to do it: https://github.com/signalapp/Signal-Android/wiki/How-to-buil...

You can build the iOS version too for development: https://github.com/signalapp/Signal-iOS/blob/master/BUILDING...

I haven't done it before but you should even be able to deploy that build to your phone in theory: https://codewithchris.com/deploy-your-app-on-an-iphone/

It's unclear to me if there are any restrictions on iOS that would prevent you from doing that.


I spent a few hours trying to get a local build of Signal-iOS working a few weeks ago, in order to write a PR fixing a bug with lost voice messages. The Xcode project uses a plethora of device entitlements I'm not allowed to have (since I don't have the proper Signal signing key). Even after a couple of hours of tweaking to get it building and deployed to my device, it's currently crashing on startup because it can't access some special Signal local device store.

You can certainly get your own build working (without notifications and other features). But personally I found it prohibitively difficult to do so.


I think you will have a problem when it comes to push notifications. I doubt a local build would be able to receive push notifications addressed to App Store builds.


I just realized one issue with this is that their latest production branch is private so you would receive delayed updates.


Reverse engineering is a thing, though. I would think there is fame to be gained in demonstrating such behavior from WhatsApp, so some hackers might feel motivated to do this from time to time.


Absolutely. Of course hackers are reverse engineering WhatsApp, that's how all those nasty exploits it has keep getting sold to governments by the NSO Group.

But reverse engineering is a skill in itself, and modern smartphone OSes use a lot of code obfuscation when apps are compiled. This effectively means even those talented hackers are going through the reverse engineering process pulling at threads until they get lucky.

Reverse engineering (in this context, at least) doesn't just show you the code as the developer wrote it. And FB hires a lot of very clever people including cybersecurity experts who could sneak these things in using innocent looking code scrambled around the app. Even open source projects are at risk of having backdoors put in that pass review and simply look like innocent bugs if they get discovered, let alone closed source apps that have to be reverse engineered.

Again not going conspiracy nut and saying that's what FB is doing. Just saying it'd be very easy for FB to hide it if they were doing it.

To me the biggest confirmed weakness of WhatsApp is the cloud backups. E2EE is pointless when the message database is synced up to iCloud or Google Drive. WhatsApp even tells you this itself. When you enable cloud backups (and they keep bugging you until you do it) it literally tells you the backups aren't secured by E2EE. [1] Because, well, of course they aren't.

[1] https://faq.whatsapp.com/iphone/chats/how-to-back-up-to-iclo...

"Media and messages you back up aren't protected by WhatsApp end-to-end encryption while in iCloud."


The same is true of Signal in most practical ways. You can only run it on platforms that are fundamentally closed-source (either iOS or Google Play Services), so there's no reason to believe the RNGs it uses (and therefore all your session keys) are not backdoored. And you can only install it through official app stores where it's difficult or impossible to inspect what binary you have or "pin" a given version. So I don't see that it's meaningfully more secure than WhatsApp.


The mere fact it hoovers up less metadata alone makes it more private. I also trust the developers way, way more than I trust Facebook. That's a personal preference, though, and if you trust WhatsApp and don't mind that it leaks contacts and other metadata to Facebook to profile you, then use WhatsApp.

If I wanted to, I could install a fork of Signal that doesn't require Google Play [1] and run it on any non-Google Android build. I would do so if it weren't for the fact I'm currently using an iPhone.

[1] https://langis.cloudfrancois.fr/


I run Signal on GrapheneOS and find this comment incorrect and borderline offensive.


I see that Signal no longer depends on Google Play Services specifically. However it's still the case that it depends on proprietary Google code (it just includes that code in its own APK now) and still can't practically be installed without auto-update (again, it just includes that in its APK).


The "proprietary Google code" is a library with a well defined API, you can see what it has access to. I agree that Signal should take it out, but it's not an especially big deal from a security perspective.

The auto-update functionality just tells you that an update is available; you can choose not to install it. You can also independently verify that the SHA-256 sum matches the one given on the website, and that the binary that sum corresponds to is produced via the reproducible build instructions. There are occasional bugs (I'd estimate a couple of times a year, though it's less and less frequent) that cause the reproducible build to not match the provided build, and it's quickly noticed by someone and an issue opened in the issue tracker. If there were no explanation or no quick resolution, people would publicly raise a stink about it.
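Checking that a downloaded APK matches a published SHA-256 sum is a few lines of stdlib Python. A hedged sketch; the demo file and checksum below are stand-ins, not Signal's real release artifacts:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so a large APK doesn't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in file; in practice, point this at the downloaded APK and set
# `published` to the checksum listed alongside the release.
with open("demo.apk", "wb") as f:
    f.write(b"not a real apk")
published = hashlib.sha256(b"not a real apk").hexdigest()

if sha256_of("demo.apk") != published:
    raise SystemExit("hash mismatch: do not install this file")
print("checksum matches the published value")
```

The same digest can then be compared against the output of the reproducible-build procedure to close the loop between source and shipped binary.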


> you can choose not to install it

There is a time bomb in there and servers will kick you out regularly unless you have updated.

If you get a patched client running you could probably change whatever string is required but some sort of action is required on the client side.


Sure, but that's an unrelated phenomenon to the security implications being discussed. The argument against auto-updates is "it's running code without my permission or ability to audit first"; putting a recency requirement for client-server communication doesn't impact that concern, and I don't see any reason why it would be considered a bad thing.


> The argument against auto-updates is "it's running code without my permission or ability to audit first"; putting a recency requirement for client-server communication doesn't impact that concern

It makes it impractical to actually audit the code you're running, because you're forced to re-audit on Signal's schedule. And it makes those audits mostly meaningless: what are you going to do if you decide a given code change is suspicious? You can't keep using the version of the code you were happy with, so you'd better have a plan in place for moving off Signal quickly - but in that case how much can you gain from using it at all?


> It’s closed source. No one has a clue what it’s doing

This is just bullshit. If you have access to the binaries you can find out what the software does.


Well, WhatsApp client does have access to the unencrypted chats.

I'm not saying this happens, but one possibility would always be to send encrypted traffic to WhatsApp's servers while opening a second, unencrypted channel to government servers if a government asks for it.


Terms and conditions were updated a few weeks ago. The fact that the key will only remain on your phone has been removed from them.


I'm surprised that Tor isn't integrated already. Moxie was pushing that at Twitter - a prototype was even built.


These blocks tend to be reactive, so if a blocked app starts using Tor, Iran will block Tor fairly quickly. So in return for a short-lived regain of the use of Signal, all of Tor gets blocked.

This proxy arrangement is better because folks who start them tell their friends in Iran, who tell their friends, but the proxies aren't listed publicly like most Tor entry nodes are. When the authorities find a proxy and block it, they only disconnect a subset of Signal users, who hopefully have other proxies they've learned about from friends or friends-of-friends. So now the blockers are trying to put out a thousand small fires that they have to find one by one.

<I have consulted for the Signal Foundation in the past, but not recently and haven't talked with anyone there about this>


Tor bridges work in the same way as this proxy, except that Tor operates a centralized way of getting access to them.


Blocking tor exit nodes is considerably easier than an arbitrary proxy server. Tor provides a list, in fact.


No, it's the opposite—if Signal wants exit nodes, they obviously won't block them. It's the entry nodes that need to be blocked. Some are easy to find, but others require you to send an email from a unique email address from a trusted provider to get lists of IPs.


Using onion services doesn't use exit nodes, that's only for exiting to the public internet.


I have a proxy up at https://signal.tube/#s.bpj.net

If you can help share more proxies to people who need them, please send me an email (in my HN profile).


I'm surprised Tor isn't blocked, since it's pretty easy to block, but if it's not, you can always tunnel your entire phone through Tor, which would include Signal. Do keep in mind that, depending on your threat model, you might want to separate your apps across multiple devices or at least accounts (I mean Android user accounts) so that only some go through Tor (see the Silk Road case for why), but that applies equally to VPNs.

I don't remember what it's called, but I think the app is official by the Tor devs and basically makes a local VPN that your phone connects to and then forwards all traffic through Tor. It was on F-Droid last time I checked.


> so only some go through Tor (see the Silk Road case for why), but that also equally applies to VPNs.

Can you explain this?


I looked into the Silk Road story again and it looks like I was misremembering how they caught DPR, but splitting your "personally identifiable" and other browsing is still a good idea.

Let's say you use the Tor browser to browse some regular (non-Tor) site that is illegal in your country for whatever reason. But let's say you then remember you still haven't paid your taxes so you open a new tab and quickly go do that. But you're still in the Tor browser, so your e-banking traffic is going out the same exit node as your "illegal" traffic. Now, anyone that saw both of those things come out of the same node can conclude that it's somewhat likely both were done by the same person. If that someone is the government, they can get access logs from your bank and see which account was accessed by the exit node's IP. The more times you do this, the stronger the link between you personally and the illegal site is.

Of course, doing your taxes through the same Tor session is something most people would know to not do, but if your entire device is tunneled through Tor, you no longer have a say in what data it leaks. Your banking app probably sends requests periodically in the background to check for updates or whatever, your email client syncs your emails, etc. If any one of those services can be coerced by your government (and chances are they can) then whatever illegal things you do in that session can be loosely linked to you. I say loosely, because there are many people on one exit node, but the data points start adding up after a while (and depending on the insanity of your leaders, just being on the list of candidates might be enough to disappear you).

As for how they would get that metadata in the first place, there are a few ways. The exit node might be under their jurisdiction, but since we're talking about bypassing censorship, it certainly isn't. They could also have compromised the "illegal" server (hacked/coerced/honeypot...), in which case it's just a matter of cross-referencing the site's logs with anything they can get their hands on (and if the government is authoritarian enough, they probably already have access to a lot). The last option is compromising the exit node, which is also not impossible. There's nothing stopping your government from setting up a thousand Tor exit nodes and logging all the metadata. If you're constantly running Tor, chances are you land on one of their exit nodes eventually.
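The cross-referencing step itself is mundane: given the site's access log and, say, a bank's log, an adversary only has to intersect entries on exit-node IP within a time window. A deliberately toy Python sketch with invented log data:

```python
from datetime import datetime, timedelta

# Invented logs: (timestamp, exit_node_ip, detail)
illegal_site_log = [
    (datetime(2021, 2, 1, 20, 3), "198.51.100.7", "viewed /forbidden"),
]
bank_log = [
    (datetime(2021, 2, 1, 20, 9), "198.51.100.7", "login: account 4471"),
    (datetime(2021, 2, 1, 21, 40), "203.0.113.9", "login: account 9020"),
]

def correlate(log_a, log_b, window=timedelta(minutes=15)):
    """Pair up entries that share an exit IP within a short time window."""
    hits = []
    for ta, ip_a, what_a in log_a:
        for tb, ip_b, what_b in log_b:
            if ip_a == ip_b and abs(ta - tb) <= window:
                hits.append((ip_a, what_a, what_b))
    return hits

print(correlate(illegal_site_log, bank_log))
# -> [('198.51.100.7', 'viewed /forbidden', 'login: account 4471')]
```

One coincidence proves little, since many users share an exit node; the risk described above is that repeated matches accumulate into a strong statistical link.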

DISCLAIMER: the above was probably a bit too paranoid, but as I have zero experience hiding from an authoritarian government, I'm not in a position to judge how much paranoia is justified. It's entirely possible that none of this applies because your specific adversary doesn't employ these specific de-anonymization tactics, but that is something you need to know for your specific situation. I assumed an "everything is fucked" threat model here, but yours might not be as severe and other types of mitigations might be more appropriate.


> Let's say you use the Tor browser to browse some regular (non-Tor) site that is illegal in your country for whatever reason. But let's say you then remember you still haven't paid your taxes so you open a new tab and quickly go do that. But you're still in the Tor browser, so your e-banking traffic is going out the same exit node as your "illegal" traffic.

That isn't how Tor works. Tor creates a new circuit for each new host you connect to, and they also create new circuits for the same host fairly regularly (every 15 minutes I think) -- both of which are done specifically to avoid this precise attack.

I also don't have experience dealing with an authoritarian regime, and there are many more aspects to OPSEC than just using Tor (after all, Tor doesn't look like normal internet traffic unless you use obfuscators -- so an authoritarian regime can just target all Tor users, which is why having Tor be used by more people is important for improving anonymity). But Tor has already dealt with obvious attacks like the one you outlined.
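For what it's worth, applications can also request circuit separation from Tor explicitly: the Tor SOCKS port isolates streams that present different SOCKS username/password pairs (the IsolateSOCKSAuth behaviour, enabled by default). A hedged Python sketch of building per-identity proxy settings; the identity names are invented, nothing here contacts a real Tor daemon, and actually using the dict with `requests` needs the PySocks extra installed:

```python
def tor_proxies(identity: str, host: str = "127.0.0.1", port: int = 9050) -> dict:
    """Build a requests-style proxies dict whose SOCKS credentials encode an
    identity. With Tor's default IsolateSOCKSAuth, streams that use
    different credentials are placed on different circuits."""
    url = f"socks5h://{identity}:x@{host}:{port}"
    return {"http": url, "https": url}

# Separate identities -> separate circuits, so the two activities
# never share an exit node even if they run at the same time.
banking = tor_proxies("banking")
browsing = tor_proxies("casual-browsing")
assert banking != browsing
print(banking["https"])
# usage sketch: requests.get(url, proxies=banking)
```

The `socks5h` scheme matters: it makes DNS resolution happen through Tor as well, rather than leaking lookups to the local resolver.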


Honest question: is using Tor not a risk by itself in Iran? I wonder if it doesn't pop up under surveillance mechanisms as conspiratorial behavior and trigger focused surveillance, which is hard to get out of.

But maybe so many people in Iran use Tor that it's not very outstanding to use it? I remember there were stats on that published on the Tor Project Website...

Edit: to answer myself after 5 minutes of thinking: of course there are bridges, too. I guess they don't appear as suspicious as regular entry nodes?


>I have a suggestion for signal team: please put tor in the signal, tor is better than any proxys or vpns.

You can get a tor proxy for Android at

https://guardianproject.info/apps/org.torproject.android/


Love back!


Thx Sherwin! Just out of curiosity: is iMessage working ok in Iran?


Not the OP, but I can confirm that iMessage works perfectly in Iran. However, because of both the economic situation (inflation, and the higher price of an iPhone compared to average Android phones) and the fact that local companies cannot release their apps on the App Store, only a small portion of people use an iPhone or, in this case, iMessage.


I’m not Iranian, but I spent a lot of time on this for a friend. iMessage works OK once you are able to activate it, but Apple insists on contacting “init-p01md.apple.com” with plaintext HTTP; this sometimes connects successfully, but often it doesn’t.



