
Funny thing is

Since neural networks are differentiable, they can be homomorphically encrypted!

That’s right, your LLM can be made to secretly produce stuff hehe



That's pretty cool, but can't any computable function be computed via FHE? So I'm not sure the differentiable part is necessary.


Any program which you apply FHE to needs to be expressed as a circuit, which implies that the time taken to run a computation needs to be fixed in advance. It's therefore impossible to express a branch instruction (or "if" statement, if you prefer).

The circuits are built out of "+" and "×" gates, which are enough to express any polynomial. In turn, these are enough to approximate any continuous function (Weierstrass's approximation theorem). In turn, every computable function on the real numbers is a continuous function - so FHE is very powerful.
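A quick sketch of the Weierstrass idea in Python (numpy assumed; the degree and interval are arbitrary choices): fit a polynomial to a continuous function, then evaluate it using only the "+" and "×" operations a circuit provides:

```python
import numpy as np

# Weierstrass in action: approximate the continuous function sin(x)
# on [0, 1] by a degree-7 polynomial.
xs = np.linspace(0.0, 1.0, 200)
coeffs = np.polyfit(xs, np.sin(xs), 7)  # least-squares fit

def poly_eval(c, x):
    # Horner's rule: evaluation uses only "+" and "×".
    acc = 0.0
    for a in c:
        acc = acc * x + a
    return acc

max_err = max(abs(poly_eval(coeffs, x) - np.sin(x)) for x in xs)
```

On this interval the degree-7 fit is already accurate to well under 1e-5, which is the sense in which "+" and "×" gates suffice.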


> In turn, every computable function on the real numbers is a continuous function

That doesn't seem right. Consider the function f(x: ℝ) = 1 if x ≥ 0, 0 otherwise. That's computable but not continuous.


That's uncomputable because equality of real numbers is undecidable. Think infinite strings of digits.


Differentiability isn’t a requirement for homomorphism, I don’t think.

Homomorphism by itself doesn’t require bijectivity. It just means: given a function f: A -> B, a binary operator * on A, and a binary operator *’ on B, f is homomorphic if f(a1*a2) = f(a1)*’f(a2). Loosely speaking, it “preserves structure”.

So if f is my encryption, then I can do *’ outside the encryption and know, because f is homomorphic, that the result is identical to doing * inside the encryption. For encryption you additionally need to be able to decrypt, so f should be bijective [1], making it an isomorphism [2], and you need “outside the encryption” variants of any operation you want to do inside the encryption. That is a different requirement to differentiability.

1: bijective means it’s a one-to-one correspondence

2: a bijective function with the homomorphism property is called an isomorphism, because it makes set A equivalent to set B in our example.
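A concrete (toy, wildly insecure) example of the homomorphic property: textbook RSA is multiplicatively homomorphic, so multiplying ciphertexts corresponds to multiplying plaintexts. The tiny parameters below are just for illustration:

```python
# Textbook RSA with toy parameters: Enc(m1) * Enc(m2) mod n
# decrypts to m1 * m2 mod n.
p, q = 61, 53
n = p * q            # 3233
e, d = 17, 2753      # e*d ≡ 1 (mod φ(n)), φ(n) = 3120

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 11
c = (enc(m1) * enc(m2)) % n   # multiply "outside the encryption"
assert dec(c) == (m1 * m2) % n
```

Here * and *’ happen to be the same operation (multiplication mod n); FHE schemes support both addition and multiplication, which is what makes them *fully* homomorphic.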


ReLU, commonly used in neural networks, is not differentiable at zero, but it can still be approximated by expressions that are efficiently FHE-evaluable. If you're being pedantic, you don't truly care about differentiability here.
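A sketch of that approximation (plain Python/numpy, not an FHE library; degree and interval are arbitrary choices): least-squares-fit a low-degree polynomial to ReLU on a bounded interval and check the error:

```python
import numpy as np

# ReLU = max(x, 0) is not differentiable at 0, but a low-degree
# polynomial (only "+" and "×", so FHE-friendly) tracks it closely
# on a bounded interval.
xs = np.linspace(-1.0, 1.0, 400)
relu = np.maximum(xs, 0.0)
coeffs = np.polyfit(xs, relu, 8)       # least-squares polynomial fit
approx = np.polyval(coeffs, xs)
max_err = float(np.max(np.abs(approx - relu)))
```

The error can't be driven to zero at the kink, but for inference purposes a small uniform error over the interval is typically enough.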

Very insightful comment, though. LLMs run under FHE (or just fully local LLMs) are a great step forward for mankind. Everyone should have the right to interact with LLMs privately. That is an ideal to strive for.



