All elementary functions from a single binary operator
Posted by Dear-Economics-315@reddit | programming | View on Reddit | 29 comments
hl_lost@reddit
the NAND comment was exactly my first thought too lol. but yeah the continuous function angle is wild — it's like someone proved there's a universal lego brick for all of calculus
honestly the symbolic regression implications are probably the most exciting part. if you can compress your function search space down to compositions of one operator, that's a massive win for automated equation discovery. could be huge for ML interpretability stuff
Kok_Nikol@reddit
This is waaay over my head, but I have a question, is this similar to mov being Turing complete? (https://drwho.virtadpt.net/files/mov.pdf)
cscottnet@reddit
Yes, in a way. As the paper states, if you want to prove a system can compute any computable function, it is sufficient to show the system is Turing-complete. Similarly, if you want to show some property holds for all "simple expressions", it is sufficient to prove that it holds for the single EML operator and the constant 1, since all simple expressions can be reduced to trees of EML and 1.
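To make that reduction concrete: a "simple expression" becomes a tree whose internal nodes are all one binary operator and whose leaves are all the constant 1, so proving a property for 1 and showing the operator preserves it proves it for every expression (structural induction). The operator below is a stand-in (log-sum-exp), not the paper's actual EML definition, which the thread never spells out; it's only there so the tree machinery runs on real inputs.

```python
import math

def op(x, y):
    # STAND-IN binary operator, purely for illustration;
    # substitute the paper's EML definition here.
    return math.log(math.exp(x) + math.exp(y))

def evaluate(tree):
    """A tree is either the leaf constant 1 or a (left, right) pair of subtrees."""
    if tree == 1:
        return 1.0
    left, right = tree
    return op(evaluate(left), evaluate(right))

# op(1, 1) = ln(e + e) = 1 + ln 2; one more level gives ln(3e) = 1 + ln 3.
print(evaluate(((1, 1), 1)))  # → 2.0986... == 1 + ln 3
```

The induction argument then rides on `evaluate`: any property that holds at the leaf and is preserved by `op` holds for every tree this function can consume.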
Kok_Nikol@reddit
Thank you!
HarterBoYY@reddit
This sounds potentially revolutionary for analog computing. So if we manage to construct a high-efficiency electrical EML gate, we can plaster a chip with billions of these and any formula just becomes a topology (wiring diagram). That means we can do AI inference, fluid dynamics, Fourier transform, etc. on the same chip at a tiny fraction of the energy cost. We'd still have a precision problem and e^x saturates quickly, but it's pretty interesting work nonetheless.
araujoms@reddit
That's the craziest paper I've read this year. Result is fascinating and completely unexpected.
looneysquash@reddit
At what point would you say it becomes unexpected?
Some aspects are obvious, at least in hindsight.
A function composed of exp() and ln() can produce exp and ln. That's not surprising. Oh, and the constant e. Well, obviously.
Sin and cos? Well, of course: that's just exp with a complex exponent.
Oh, and it can do arbitrary powers. I remember something about it being able to do that from calculus.
Somewhere after that it becomes more surprising.
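The "obvious in hindsight" items above are all one-line identities, easy to spot-check numerically once exp and ln are on hand:

```python
import math
import cmath

# The constant e is just exp(1).
assert math.isclose(math.exp(1.0), math.e)

# Arbitrary powers: x**a == exp(a * ln x) for x > 0.
x, a = 3.7, 2.41
assert math.isclose(x ** a, math.exp(a * math.log(x)))

# sin and cos are exp with a complex exponent (Euler's formula):
# exp(i*t) == cos t + i*sin t
t = 0.83
assert cmath.isclose(cmath.exp(1j * t), complex(math.cos(t), math.sin(t)))

print("all identities hold")
```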
araujoms@reddit
I didn't expect to get sums and products of functions.
bobogei81123@reddit
It has both exp and log, so you can easily cancel them out
It has subtraction, and once you have x - y you can do x - (0 - y) = x + y
exp turns sums into products (log goes the other way): xy = exp(ln x + ln y)
trigonometric functions are just combinations of exp(ix)
And because log is the inverse of exp, I think it is expected that you can get the inverse trigonometric functions by piecing logs and exps together
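Each of those bullet points is a checkable identity on its own; a quick numeric pass over them:

```python
import math
import cmath

x, y = 2.5, 0.75

# Addition from subtraction alone: x + y == x - (0 - y).
assert math.isclose(x - (0 - y), x + y)

# Products from sums via exp/ln: x*y == exp(ln x + ln y) for x, y > 0.
assert math.isclose(math.exp(math.log(x) + math.log(y)), x * y)

# Trig from complex exp: sin t == (exp(it) - exp(-it)) / 2i.
t = 1.2
assert cmath.isclose((cmath.exp(1j * t) - cmath.exp(-1j * t)) / 2j, math.sin(t))

print("all identities hold")
```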
araujoms@reddit
Ah yes, you can get the inverse trigonometric functions by using the complex logarithm: asin(x) = -i log(ix + sqrt(1-x^2)).
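That closed form checks out numerically against the library arcsin on the real interval:

```python
import math
import cmath

def asin_via_log(x):
    # asin(x) = -i * log(i*x + sqrt(1 - x^2)), real part taken for real x in [-1, 1]
    return (-1j * cmath.log(1j * x + cmath.sqrt(1 - x * x))).real

for x in (-0.9, -0.3, 0.0, 0.5, 0.99):
    assert math.isclose(asin_via_log(x), math.asin(x), abs_tol=1e-12)

print("asin identity holds on samples")
```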
dethswatch@reddit
ok… I’ll ask: anyone want to explain it?
syzygyhack@reddit
Slop.
Very telling comments.
Blue_Moon_Lake@reddit
Is there a page that shows, step by step, how all the functions can be rewritten?
kabocha_@reddit
I was looking at https://github.com/VA00/SymbolicRegressionPackage/blob/master/rust_verify/verify_eml_symbolic_chain.wl, don't know if there's a nice table somewhere.
cscottnet@reddit
Pages 13 and 14 of the supplement give the proofs: https://arxiv.org/src/2603.21852v2/anc/SupplementaryInformation.pdf from which you can reconstruct any function.
Yes, it's a shame we didn't get a table with the fully expanded results, but some of them are quite large.
funtimes-forall@reddit
Does this have any practical use?
kabocha_@reddit
I've only read the abstract and not the whole paper yet, but this sounds pretty useful:
Beyond that, the first thing I imagined was eking out some niche optimizations with SIMD or the like that you might not get otherwise (e.g. by packing "eml" operations together where you couldn't with distinct functions), but I haven't thought of a concrete example.
saf_e@reddit
Not sure about practical use. ln is a very slow function and e^x is not a quick one either, and for many use cases you need to call them many times. So most of the time you already have a better option.
kabocha_@reddit
I've read all of the paper now and played around with the GitHub repo a tiny bit.
I probably agree, and was surprised how quickly those eml trees grow as they're composed.
Even if we heavily optimized CPUs for exp and ln as a result of this paper, it still probably wouldn't be worth it, just because of how fast those binary trees grow.
Cool concept either way, though!
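The growth complaint is easy to make concrete with a toy count, and nothing here is specific to eml: substituting a subtree for every leaf multiplies tree size, so nesting even a tiny pattern a few levels deep grows exponentially in the depth. (The 5-node pattern below is arbitrary, chosen just for the demonstration.)

```python
def size(tree):
    # leaves are the constant 1; internal nodes are (left, right) pairs
    if tree == 1:
        return 1
    left, right = tree
    return 1 + size(left) + size(right)

def substitute(tree, replacement):
    # replace every leaf with `replacement`
    if tree == 1:
        return replacement
    left, right = tree
    return (substitute(left, replacement), substitute(right, replacement))

pattern = ((1, 1), 1)  # 5 nodes, 3 leaves
t = pattern
sizes = [size(t)]
for _ in range(4):
    t = substitute(t, pattern)
    sizes.append(size(t))

print(sizes)  # → [5, 17, 53, 161, 485]
```

With 3 leaves per pattern the recurrence is s ← 3s + 2, i.e. roughly tripling per composition level, which matches the "binary trees grow fast" experience above.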
funtimes-forall@reddit
Thanks! That's a very insightful idea.
pavelpotocek@reddit
It is useful for proofs: if you want to prove a property for all "common functions", you can just prove it for eml.
It can also simplify construction of the "common functions": in hardware, software or in proofs. Again, just one function is enough to construct, instead of multiple.
It is similar to what other simple models do for other branches of mathematics: Turing machines and lambda calculus for computation, SAT for complexity analysis, etc.
ST0PPELB4RT@reddit
Thanks. The "for all common functions" part for proofs is really interesting.
sweetno@reddit
No Bessel function, useless! /s
Inevitable_Exam_2177@reddit
Wake me up when it can do elliptic integrals too
(Also /s… I think this is super weird and cool)
Although I’m curious: the complete elliptic integral functions also decompose into logs and sinusoids, so I wonder if they could be rewritten via the eml function too.
___Archmage___@reddit
I was gonna say "doesn't everyone already know NAND can create everything?", but after reading the article, the difference is this new discovery applies to continuous functions
Lowetheiy@reddit
I am not sure what problem this is trying to solve
Lucas_F_A@reddit
This isn't trying to solve a problem; a lot of maths is like that.
afl_ext@reddit
Time to try this crazy stuff in my also crazy logic circuit simulator
tomster10010@reddit
That's crazy, really cool stuff. Allegedly big implications for symbolic regression, although I don't know what that's used for.