Lumo is powered by open-source large language models (LLMs) that Proton has optimized to give you the best answer, using the model most capable of handling your request. The models we’re currently using are Nemo, OpenHands 32B, OLMo 2 32B, and Mistral Small 3. These run exclusively on servers Proton controls, so your data is never stored on a third-party platform. Lumo’s code is open source, meaning anyone can see that it’s secure and does what it claims to. We’re constantly improving Lumo with the latest models that give the best user experience.
Running those small models is usually not a problem for SMEs or homelabs. Serving full Kimi K2, Qwen3, or DeepSeek V3/R1 under Proton's conditions would be an interesting offer.
I believe Apple provides guarantees that data access is impossible under most circumstances, creates auditable, cryptographically secure hardware logs, and allows third-party inspection of its facilities to ensure compliance with its own stated design and protocols.
> the system doesn’t even include a general-purpose logging mechanism. Instead, only pre-specified, structured, and audited logs and metrics can leave the node, and multiple independent layers of review help prevent user data from accidentally being exposed through these mechanisms
> We consider allowing security researchers to verify the end-to-end security and privacy guarantees of Private Cloud Compute to be a critical requirement for ongoing public trust in the system
> Private Cloud Compute hardware security starts at manufacturing, where we inventory and perform high-resolution imaging of the components of the PCC node before each server is sealed and its tamper switch is activated. When they arrive in the data center, we perform extensive revalidation before the servers are allowed to be provisioned for PCC. The process involves multiple Apple teams that cross-check data from independent sources, and the process is further monitored by a third-party observer not affiliated with Apple. At the end, a certificate is issued for keys rooted in the Secure Enclave UID for each PCC node. The user’s device will not send data to any PCC nodes if it cannot validate their certificates.
> Every production Private Cloud Compute software image will be published for independent binary inspection — including the OS, applications, and all relevant executables, which researchers can verify against the measurements in the transparency log. Software will be published within 90 days of inclusion in the log, or after relevant software updates are available, whichever is sooner. Once a release has been signed into the log, it cannot be removed without detection
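The verification flow in that quote boils down to two checks a researcher can do: hash the published image and confirm it matches a measurement in the log, and confirm the log itself is append-only so a signed release can't quietly disappear. Here is a minimal sketch of that idea in Python; the `measure` and `merkle_root` helpers are illustrative stand-ins, not Apple's actual PCC tooling or measurement format:

```python
# Sketch (not Apple's real tooling): verify a published software image against
# a measurement in an append-only, Merkle-tree-backed transparency log.
import hashlib

def measure(image_bytes: bytes) -> str:
    """Stand-in for a software measurement: SHA-256 of the published image."""
    return hashlib.sha256(image_bytes).hexdigest()

def merkle_root(leaves: list[str]) -> str:
    """Fold leaf hashes into a single root; altering or dropping any leaf changes it."""
    level = [hashlib.sha256(leaf.encode()).hexdigest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]

# The operator publishes measurements; observers remember the log's root.
log = [measure(b"os-release-1.0"), measure(b"app-release-2.3")]
root = merkle_root(log)

# A researcher downloads the image, measures it, and checks it against the log.
assert measure(b"os-release-1.0") in log      # image matches a logged measurement
assert merkle_root(log) == root               # log unchanged since it was observed
assert merkle_root(log[:1]) != root           # removing an entry is detectable
```

Real transparency logs (e.g. Certificate Transparency) additionally use signed tree heads and inclusion/consistency proofs so clients don't need the full log, but the detection property is the same: once an entry is in the tree, removing it changes the root.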
> Additionally, PCC requests go through an OHTTP relay — operated by a third party — which hides the device’s source IP address before the request ever reaches the PCC infrastructure
I'm not saying it's an infallible system. Just relaying what Apple themselves announced.
That only says that Apple self-certifies as being open for audit and that they don’t get any of this data. Who is keeping an eye on that externally though? For every release?
I don't know. They posted this about a year ago, and some of the language was intentionally vague ("third-party"), presumably because they were still selecting partners. Not everything was implemented at the time. Hopefully we get an update soon on the status of their private datacenter and more information about the auditing process. As it stands now, a third party supposedly reviews new machine provisioning, and for releases, security researchers will be able to cross-check transparency logs and use cryptography to verify that the binary running on a machine is what Apple says it is.
I think it's a pretty advanced and thoughtful approach, but it definitely has its limitations. Hopefully Apple iterates on this over time.
Between you and me, though, it's hard to tell whether Apple's ostensible commitment to privacy is just theatre, given the locked-down and user-hostile nature of their operating systems.
The rest of iCloud is quite open by default though. It’s a lot simpler to just get the data from there than to try to access the private cloud context used by Apple’s models.
It’s funny how, when it’s Apple, everyone is happy to defend even the most incomprehensible decisions with “privacy as a feature”, while for everyone else privacy apparently doesn’t count. I think “Donald Trump can’t get your photos” is a pretty good selling point.
> everyone is happy to defend even the most incomprehensible decisions with “privacy as a feature”
Not me. I care about privacy and I know they care about privacy, but what I want to see is that they have a product in the first place before all those other things.
In fact, I more or less knew Apple wouldn't ship a good product when all they talked about was privacy instead of providing any meaningful performance data. Turns out it's all just vaporware.