What if everything was "Async", but nothing needed "Await"? -- Automatic Concurrency in Par
Posted by faiface@reddit | programming | View on Reddit | 124 comments
I made a new video, showcasing and explaining the "automatic concurrency" in the Par programming language!
I think this is the first time I actually manage to convey this unusual, but absolutely foundational feature of my language.
In the video, I walk through a "concurrent downloader" application, visualize how it's put together, and explain how Par's concurrent evaluation makes it all work.
I'm very curious to hear what you think!
And if you don't know, Par is an innovative (and WIP) programming language with linear types, duality, automatic concurrency, and more.
Yesterday's discussion in r/ProgrammingLanguages: https://www.reddit.com/r/ProgrammingLanguages/comments/1ozlvuw/what_if_everything_was_async_but_nothing_needed/
lukaslalinsky@reddit
There is a much more intuitive way of using asynchronous code without await - virtual threads / green threads / fibers.
It works well in Go and Java. I recently made a framework for Zig, but there are many other runtimes like that for C++ and it's gaining popularity even in Rust.
seweso@reddit
I'm very, very much a fan of how async/await is currently implemented in different languages.
I mean, in C# I can even write assembly and complex async/await code in the same file.
Okay, now sell it to me. Because I don't see the problem your solution fixes... yet.
Luolong@reddit
Actually, while async/await are a significant quality of life improvement over what was before, I don’t for a minute believe that this is the pinnacle of the concurrent programming.
For one, the function colouring is definitely an issue. But also, there are better approaches.
Java virtual threads for one, offer near seamless concurrency with green threads and work stealing without explicit function colouring.
Blue_Moon_Lake@reddit
I'm a firm believer that the next step is to invert the logic.
Everything is implicitly `async`/`await` if needed, unless you explicitly want something to be `sync`. So you don't need to propagate the change in your codebase when suddenly a function becomes asynchronous.
And when you want to do some parallel promise handling, you're required to collapse the type from `T | Promise<T>` (which we could call `Awaitable<T>`) to `Promise<T>` by using a wrapper function.
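A sketch of this idea in today's TypeScript (the `Awaitable<T>` alias and the `asPromise` wrapper are hypothetical names, not an existing API):

```typescript
// The proposal: values may or may not be promises, and most code shouldn't care.
type Awaitable<T> = T | Promise<T>;

// The one explicit step: collapse Awaitable<T> into a real Promise<T>
// whenever you want to handle promises in parallel.
function asPromise<T>(value: Awaitable<T>): Promise<T> {
  return Promise.resolve(value);
}

async function example(a: Awaitable<number>, b: Awaitable<number>): Promise<number> {
  // Explicit parallel handling: force both into promises, then combine them.
  const [x, y] = await Promise.all([asPromise(a), asPromise(b)]);
  return x + y;
}
```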
I don’t think this is necessary.
For one, most code is naturally synchronous. Think of most computation- or memory bound tasks.
Only IO bound tasks require (implicit) asynchrony, yielding control to the runtime until it can be safely resumed once IO completes.
That is, 90% of code can still be safely synchronous without becoming overly complicated or incurring undue runtime overhead.
There’s probably a handful of patterns that require some level of explicit asynchronous execution but those are few and far between.
Blue_Moon_Lake@reddit
I don't know what kind of projects you're doing, but most of mine would benefit from switching between sync and async implicitly when needed.
There are a thousand `async` in the codebase I'm currently working on, almost 2 thousand `await`. Not once have I needed to handle parallel promises manually.
At least twice we have needed to update a lot of files for just one thing becoming asynchronous.
DLCSpider@reddit
Where, outside of web servers, do you need async in large quantities?
Async APIs infect a lot of stuff where you don't need them, but in those cases the correct thing to do is to just block on them and not make everything else async.
Blue_Moon_Lake@reddit
Good luck not needing to access any DB, caching, file system, nor third-party API.
DLCSpider@reddit
That's exactly my point. Let's say I'm writing a script which reads and parses a file and forwards the file content to a DB or web endpoint. Lots of async APIs. Think about what happens when you execute that script.
Implement async properly: your runtime puts the script to sleep at every await point until it can continue.
Block on async: your operating system puts the script process to sleep at every await point until it can continue.
The result is exactly the same. The script is not starved of threads because it only needs one. There is no UI which hangs because it doesn't have one (and even if it has: worker threads are a good pattern to implement anyway).
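A sketch of the kind of script being described, assuming Node 18+ (built-in `fetch`, `node:fs/promises`) and a placeholder endpoint:

```typescript
import { readFile } from "node:fs/promises";

// A strictly sequential script: each step needs the previous step's result,
// so whether the runtime parks this single task at every await, or the OS
// parks a blocking process, the observable behaviour is the same.
async function main(): Promise<void> {
  const raw = await readFile("data.json", "utf8"); // wait for the file
  const records = JSON.parse(raw);                 // CPU-bound parsing
  await fetch("https://example.invalid/ingest", {  // wait for the endpoint
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(records),
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```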
Blue_Moon_Lake@reddit
Yes, but now imagine that your NodeJS server has 50 functions/methods that are synchronous and now you add one async call. Now you need to change these 50 functions/methods to be `async` because it's not implicitly handling that change.
Also, I can't afford to have the whole thing use synchronous I/O and grind to a halt while it's waiting for the ability to get a file lock; it can process other things in the meantime.
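A toy TypeScript sketch of the propagation being described (hypothetical names; the synchronous "before" chain and the asynchronous "after" chain are shown side by side):

```typescript
type Config = { title: string };

// Before: a purely synchronous call chain.
function loadConfigSync(): Config {
  return { title: "report" };
}
function buildReportSync(): string {
  return `# ${loadConfigSync().title}`;
}

// After: loadConfig now touches the filesystem/DB, so every caller up the
// chain has to become async and every call site gains an await.
async function loadConfig(): Promise<Config> {
  return { title: "report" }; // imagine a DB or fs call here
}
async function buildReport(): Promise<string> {
  return `# ${(await loadConfig()).title}`;
}
async function handleRequest(): Promise<string> {
  return buildReport(); // returns the promise; callers still have to await it
}

handleRequest().then(console.log);
```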
Luolong@reddit
This is precisely why Java solution is so goddamn ingenious.
You use the usual sync IO API in your code as usual. Only when running on a virtual thread (which you need to set up separately if you need it — or which is set up by a web framework for you) will the sync IO API be translated to async IO by the VM/SDK — your virtual thread will yield its control until the IO returns, at which point it will be resumed and program flow continues where it left off. Meanwhile, the OS carrier thread can continue running other virtual threads, speeding up overall progress considerably.
The genius of this approach is that from the programmer’s point of view, your code remains synchronous all the time and implicit async kicks in only when execution context (thread) supports it.
Blue_Moon_Lake@reddit
Yes, but JS doesn't have that and it won't help when I'm doing backend code that must run on NodeJS.
Luolong@reddit
Sure, but I was not aware we were talking about JS features. I thought this thread was about alternate models of dealing with asynchronous code, where async/await style programming model (implemented in many languages) was one of the options.
Blue_Moon_Lake@reddit
There are so many better ways to do it in other languages that I consider discussions about `async`/`await` to be about JS.
Luolong@reddit
Not just JS. C# was the first language to introduce async/await syntax for async programming at the language level.
JS (and TS) were next, but same form of async/await has been introduced in many other languages as well — Python and Rust being probably more prominent ones.
This is why I brought up Java's model as an alternative to async/await style asynchrony.
Getting back to OP, Par seems to go to the other extreme and makes everything implicitly asynchronous by default.
seweso@reddit
I'm not against automatically adding awaits, or even marking all functions in a subtree/namespace as async.
But then your IDE must show this with colors or something, so you don't assume its sync when it isn't (or vice versa).
Blue_Moon_Lake@reddit
Why would you need that? If you absolutely want it to be synchronous, you flag it as such with `sync`.
seweso@reddit
Because in some projects most if not all code is sync.
I would also expect this to automatically work.
In the sense that a func becomes async if it calls something async
Blue_Moon_Lake@reddit
But it would. If you don't throw an async call in somewhere, everything would be implicitly synchronous.
seweso@reddit
That would be very weird for compiled languages. Then you need to split a function into very small blobs of compute which do almost nothing.
Probably also OP's problem...
Blue_Moon_Lake@reddit
Why would you need to do that? If you put async stuff in, you can't expect it to behave in a pure synchronous fashion.
And for a compiled language, it would probably be more efficient, as the compiler can explore the entire call tree to determine where it needs to handle something asynchronously and where it can be streamlined into implicitly synchronous code.
Luolong@reddit
In Java, virtual threads allow writing fully synchronous code without ever needing to worry about IO being async.
Writing async code in this fashion does not differ at all from writing regular synchronous code. The only difference is that the code running on virtual thread handles IO in asynchronous manner.
Running same code on regular platform thread uses synchronous IO.
The result is that you have an easy path to take advantage of async programming — simply choose to run code on virtual threads instead of platform (sync) threads.
When you need explicit concurrency, you use the usual platform capabilities by coordinating multiple threads of computation using the usual APIs of choice (threads, reactive, channels, etc.). But that is an extremely rare and specialised use case, and 99% of line-of-business programming should stay in the comfortable synchronous style.
initial-algebra@reddit
Function colouring has never been the issue. The real problem is not being able to write "colour-polymorphic" code.
Luolong@reddit
True, there are languages which can be polymorphic over various effects (not just asynchrony), but they are mostly research languages with very little practical use.
grauenwolf@reddit
I like function colouring and would like to see experiments that take it further. For example, in T-SQL the compiler knows whether or not a function is deterministic and can emit different execution plans (or C++ code) based on that information.
Would C# benefit from this? I don't know but would like to find out.
-Mobius-Strip-Tease-@reddit
Algebraic effects and handlers are the way forward imo. I like to think of the function coloring problem as one that really is a function type problem. Koka has arguably one of the cleanest implementations of this idea that I have seen.
faiface@reddit (OP)
Okay, let me try and sell it to you.
So, with everything being fundamentally concurrent, basic types like lists (and more) can have long running concurrent processes behind them, producing them.
As a result, I can work with the concurrent structure of my application just like I would with data structures. I can store "channels" (just types) to concurrent processes in maps, lists, and just manipulate them. The types, while looking normal, actually define the communication boundaries.
And with the linearity and other rules, I can't leak them and I can't deadlock them. So this scales without needing to worry about unhandled channels, leaked coroutines, ignored messages, and so on. Just can't happen.
seweso@reddit
Sounds like syntactical sugar to me. Which ultimately is about readability of code. But seems to me that it also adds magic, and if the abstraction layer leaks....needs debugging...you will not be happy.
Does your language fix all async issues? Does everything magically work? Is this for juniors atm, or senior devs?
faiface@reddit (OP)
It definitely does add “magic”, however I’m pretty confident that the abstraction layer doesn’t leak. With everything being concurrent, and at the same time with no way to forget about obligations and no way to leak coroutines, everything kinda snaps together.
Does it fix all async issues? I don’t know, you need to tell me which ones you have in mind.
seweso@reddit
Does it throw an error if there is a deadlock? Is it easy to work with and pass around task and promise objects? Can properties be async?
Anyway, i checked out some of your examples. I'm not impressed by the readability.
If current programming languages are equivalent in terms of features, and your programming language is less readable/usable, then it doesn't really serve a purpose.
You seem to be going for brevity. But i'm not sure that's smart.
faiface@reddit (OP)
Deadlocks are statically ruled out before your program runs, so no need to deal with them. Of course that comes at a cost of imposing some restrictions on you.
Couldn’t be easier since there are no explicit task or promise objects. Everything is concurrent, so everything is a “promise”. A “String” is a “promise of a string”, etc. That’s why just a basic list essentially acts as an async iterator, and you can even “fan in” lists, like shown in the video.
Additionally, the types are expressive enough to express various communication protocols, the basic types act doubly as session types.
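As a rough analogy for readers coming from JS/TS (an illustration only, not Par's implementation), "fan in" on async sequences looks like merging async iterators:

```typescript
// Merge several async iterables, yielding items as soon as any source produces one.
// Usage: for await (const item of merge(streamA, streamB)) { ... }
async function* merge<T>(...sources: AsyncIterable<T>[]): AsyncGenerator<T> {
  const iters = sources.map((s) => s[Symbol.asyncIterator]());
  // One pending next() per live iterator, tagged with its index.
  const pending = new Map<number, Promise<{ i: number; r: IteratorResult<T> }>>();
  iters.forEach((it, i) => pending.set(i, it.next().then((r) => ({ i, r }))));

  while (pending.size > 0) {
    const { i, r } = await Promise.race(pending.values());
    if (r.done) {
      pending.delete(i);
    } else {
      pending.set(i, iters[i].next().then((res) => ({ i, r: res })));
      yield r.value;
    }
  }
}
```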
That's a fair point, the syntax is foreign, but I have to say, it's not foreign artificially, because Par itself is not like other languages. It has different features than almost all current programming languages. You can find analogies, of course, but it really is different. The features are mainly designed based on the underlying logic: classical linear logic. And it offers a very unique paradigm.
Some features that are unusual:
- Linear types as a basis
- Duality and "construction by destruction" (very useful btw)
- Choice types
- Process syntax (enables fluid interfaces that include output and handling multiple options)
- First-class generic values
- Distinction between recursive (foldable) and corecursive (unfoldable) types
- And more
I have tried to make the syntax as pleasant as possible, but also consistent, and not obscuring real symmetries. I’m not saying it couldn’t be done better, but I haven’t seen it done better, especially in research papers.
seweso@reddit
I'm more intrigued, thanks for the answers.
How do you NOT await something? Does an array with 100,000 strings actually contain 100,000 promises pointing at strings?
How do you do promises which resolve to promises? How do you do stuff like Promise.race()?
And if you prevent deadlocks only at compile time, that means I can't even write the kind of code where deadlocks are caused at runtime.
I'm not entirely convinced that you can still write the same applications with your programming language.
And I would love to see comparisons of real-life coding issues, comparing async code in TS, C#, and Java for instance.
Have you considered writing a transpiler/weaver instead, and outputting .NET bytecode or Java bytecode? What is the advantage of starting from scratch here? Is your language akin to what Rust does for memory management? Does it (will it) prevent a whole class of bugs?
I mean, in 25 years i haven't ever had a deadlock in production. I've had production issues because tasks couldn't efficiently fan-out (locking). Or memory issues because data isn't streamed in some input or output pipeline (async iterators).
Is it research or development? ;)
c-digs@reddit
That's just an `async` enumerator, no? I don't find these particularly hard to use in JS or C#.
faiface@reddit (OP)
Yes, an async enumerator accomplishes the same as a list in Par.
The difference is that in Par, that's just a list that you can define yourself in 4 lines of code.
Suddenly want a "list", but one that takes a value back after producing each item (so bidirectional communication), and have it be just as concurrent?
Easy, just add `[Input]` after every item in the type and that's it!
So the thing is, in Par, you can easily define all of these yourself and adapt them to your use cases.
grauenwolf@reddit
I want to know when my code is jumping between threads. A lot of things are tied to the thread.
EC36339@reddit
I think in an `async` function, `await` should be implicit. If you are going to add syntactic sugar, do it properly.
If you want the `Task` object (or future or whatever), then there should be an alternative syntax for it, or you could call a sync function which calls an async function to get the task/future.
grauenwolf@reddit
Task isn't just used for asynchronous operations. It's also used for parallel, CPU bound operations.
EC36339@reddit
That's repurposing (as opposed to reuse), which is usually bad design.
grauenwolf@reddit
No, that's the original purpose. Async/await came later.
EC36339@reddit
It's still repurposing.
grauenwolf@reddit
I'm here to tell you the history of the feature, not to play semantic games.
EC36339@reddit
You said "no" when I said it was repurposed.
It doesn't matter what it was used for first.
faiface@reddit (OP)
That’s very understandable if your focus is systems programming.
Par is explicitly not a systems programming language, I think there’s enough of those.
Instead, Par is an applications programming language.
chucker23n@reddit
One of the canonical .NET async/await examples is this:
This is a lightweight yet effective way of knowing, "I can safely do tasks in a separate context yet come back to the UI context, regardless of whether 'context' means a different thread or not".
grauenwolf@reddit
That's the key part. People need to understand that await doesn't say anything about where the code that is being awaited will run. Maybe the same thread, maybe a different thread, maybe there is no thread at all (e.g. file operations).
Wooden-Estimate-3460@reddit
One of the benefits of explicitly awaiting is you can choose to kick off (child) tasks in parallel (start multiple and then await all). This is desirable even when not doing systems programming.
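For example, in TypeScript, with hypothetical child tasks:

```typescript
// Hypothetical stand-ins for real child tasks.
const fetchUsers = async () => ["alice", "bob"];
const fetchOrders = async () => [42];

async function loadDashboard() {
  // Start both tasks immediately, then await them together once both are in flight.
  const [users, orders] = await Promise.all([fetchUsers(), fetchOrders()]);
  return { users, orders };
}

loadDashboard().then(console.log);
```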
faiface@reddit (OP)
To clarify, Par doesn't automatically insert await everywhere. It's quite the opposite. It just doesn't block on anything.
So literally if I write this:
All 3 functions will run concurrently. And the program just goes on.
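In TypeScript terms, a loose analogy (hypothetical stand-in functions) would be calling async functions without awaiting them; the difference is that in Par the type system won't let the results be dropped:

```typescript
// Hypothetical stand-ins for real asynchronous work.
const downloadA = async () => "a";
const downloadB = async () => "b";
const downloadC = async () => "c";

function run(): void {
  // All three start immediately and run concurrently; run() returns at once.
  // Unlike in Par, nothing here forces the results to ever be collected.
  void downloadA();
  void downloadB();
  void downloadC();
}

run();
```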
grauenwolf@reddit
That's not always a good thing.
In .NET, you actually get better single-threaded performance if you block. So for stuff like bulk data loaders, I don't use async/await.
In terms of performance, asynchronous code doesn't really become beneficial until you've saturated the thread pool and are paying for the context switching. (There are still other reasons for it such as not blocking the UI thread.)
faiface@reddit (OP)
That's very true that it's not always the best for performance. But Par is exploring the expressivity side of this. Asking the question, "what is programming like if it works this way?"
And so far, it looks like it brings a lot of composability. If that pans out as such, there's a lot of space and time for optimizing for performance.
grauenwolf@reddit
I write banking software on Windows. In addition to the UI thread, some types of locks are thread sensitive. And of course thread local storage is affected. I also need to ensure that long running tasks aren't using a thread pool thread.
I get the desire to not want to think about concurrency, but it's important to know what's going on.
aweyeahdawg@reddit
And it’s not like it’s super hard, if you’ve been doing it for a few years you don’t even think about it anymore. This seems to be geared towards new programmers who can’t seem to dedicate the time to learn how to do it themselves.
johndoe2561@reddit
Lol. Tell that to the hordes of TS/JS devs that can't remember to await promises without an eslint rule.
faiface@reddit (OP)
Par is definitely not intended for new programmers, its concepts like duality will mess with your mind more than when you first learned functional programming :D
But they do bring a lot of expressivity, it’s just a very foreign paradigm. Par is closer to pi-calculus, than to anything else.
Ran4@reddit
Most of the time people do not need that.
grauenwolf@reddit
That really depends on the type of application. If I'm building something with a user interface, all of the frameworks I use require knowing if I'm on a UI thread.
If I'm building a thin Web API wrapper around a database, then I agree that I can largely ignore it.
yawkat@reddit
The concept of having a dedicated UI thread exists for concurrency control in an imperative programming model. It makes no sense for a language like Par. You would design a UI framework differently.
grauenwolf@reddit
Uh huh, and if you wanted to visit Alpha Centauri you would just design your spaceship differently. Don't bother with thinking about 'how' you would design the spaceship. Those details are trivial.
Stunning_Ad_1685@reddit
The Pony language has the best answer to the question of “when is my code jumping between threads” (:
grauenwolf@reddit
You'll have to explain that one to me.
Ahri@reddit
I assume they're talking about Pony because it has the Actor Model baked in, so you know what thread you're in because you are in (effectively) an opinionated framework elevated to language level. Disclaimer: I haven't used it, just spotted it a while back and thought it [looked interesting](https://www.ponylang.io/discover/).
RandomName8@reddit
Most comments on this thread are so confused by the tree in front of the forest...
faiface@reddit (OP)
Could be my fault as well with the “async/await” framing. But I gotta say, the overall reception here is better than what one could expect from r/programming when it comes to novelty :D
RandomName8@reddit
I think a comparison to a compiler's read/write instructions reordering by establishing a data dependency tree would be better.
faiface@reddit (OP)
Perhaps, but there’s an important distinction. What you’re describing happens at compile time, but what Par does happens at runtime because the order depends on timing as well.
HappinessFactory@reddit
Wait what if I don't want to await?
Or do I not understand the premise
faiface@reddit (OP)
It doesn’t automatically await. Instead, it runs everything concurrently, automatically.
HappinessFactory@reddit
Now I'm even more confused. Doesn't every language that supports some version of promises work like that?
Like without the `await` keyword JavaScript will continue within its execution process.
Maybe I just need to watch the video
faiface@reddit (OP)
Of course, but you need to get the values at some point and synchronize properly.
I think watching the video will be the most illuminating ;) But feel free to ask any more specific questions before or after!
ballinb0ss@reddit
Hey lots of people tearing this apart but I think this is a neat idea.
faiface@reddit (OP)
Thanks, appreciated :)
psychelic_patch@reddit
OK. Make me a Treiber stack and show me how you actually deal with the ABA problem, please.
andlewis@reddit
I would absolutely enable the configuration to automatically await async methods without the await keyword in dotnet. I’d also be supportive of adding a keyword to explicitly not await instead of the current situation.
Basically reverse the await syntax, make it opt-out rather than opt-in.
CookieOfFortune@reddit
Have you ever used LabVIEW? It also does this automatic concurrency.
Although it's not a particularly well designed language and thus can suffer from the same issues normal languages can. Eg. it doesn't have a particularly expressive type system.
MMetalRain@reddit
I think the concept is cool, but I would like it to be selective. For example I'm building basic website, having async at top level is nice, I can combine things together that resolve at different speeds. But I don't need that at lower level when I'm just joining strings together.
Boza_s6@reddit
Kotlin is a mainstream language that doesn't need async/await markers, although it colors functions with the suspend modifier. What's shown in the video seems like what you'd get if all Kotlin code used suspend by default.
Interesting stuff
Kind-Armadillo-2340@reddit
It just awaits all suspend functions by default. You have to tell it not to await, which maybe makes more sense since usually you're awaiting most suspend functions, and there's a handful you want to dispatch and join together later.
fojam@reddit
One of the downsides of an otherwise really nice language
startfragment@reddit
Death to two colored functions!
ElCthuluIncognito@reddit
Love these kinds of experiments. There was once a time when a programming language using garbage collection was considered an exercise in absurdity!
I’m curious, did you draw any inspiration from Fortress? I wonder how different your goals and approach are to that?
lethalman@reddit
You cannot avoid deadlocks when you deal with resources, for example when doing a distributed leadership election.
faiface@reddit (OP)
Internally, you indeed can. Just like Rust prevents memory violations despite being able to share memory, using its type system, Par prevents deadlocks. They are simply not expressible inside a Par program.
And externally, ie when it comes to network communication, you just have timeouts.
Smooth-Zucchini4923@reddit
When you say that "deadlocks are not expressible," what do you mean by that? Do you mean that Par doesn't have a lock primitive? Do you mean that a program can't hold two locks at once? Do you mean that Par forces you to acquire the locks in a specific order?
I can think of a lot of ways to skin this cat, but just saying it's not expressible doesn't really clarify anything to me.
faiface@reddit (OP)
Right, Par does have locks, although we're still exploring the best ways to make them as usable as possible. But for example, the "fan in" in the video uses a mutex to share a "producer object" of the merged list.
The way Par accomplishes this is by enforcing that the accessibility between concurrent units always forms a tree. That tree may be very dynamic and changing, but at any point, it is a tree. That makes it impossible to construct cyclic dependencies.
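For contrast, a TypeScript sketch of the kind of cyclic wait that the tree rule excludes (a minimal hand-rolled mutex, two tasks locking in opposite orders):

```typescript
class Mutex {
  private tail: Promise<void> = Promise.resolve();

  // Resolves to a release function once the previous holder has released.
  lock(): Promise<() => void> {
    let release!: () => void;
    const next = new Promise<void>((resolve) => (release = resolve));
    const acquired = this.tail.then(() => release);
    this.tail = next;
    return acquired;
  }
}

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));
const a = new Mutex();
const b = new Mutex();

async function taskA() {
  const releaseA = await a.lock();
  await sleep(10);
  const releaseB = await b.lock(); // waits for taskB to release b ...
  releaseB(); releaseA();
}

async function taskB() {
  const releaseB = await b.lock();
  await sleep(10);
  const releaseA = await a.lock(); // ... which waits for taskA to release a
  releaseA(); releaseB();
}

// Neither promise ever settles: a classic cyclic-wait deadlock.
Promise.all([taskA(), taskB()]).then(() => console.log("never reached"));
```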
Smooth-Zucchini4923@reddit
Thanks, that's quite interesting.
lethalman@reddit
What I'm trying to say is that your claim on your website is very broad; it reads like a scam. If it avoids deadlocks only internally, then say that.
faiface@reddit (OP)
I think my claim is accurate. I don’t consider a fetch request timing out a deadlock. The program isn’t stuck and proceeds.
And that’s true for Par in general, whether internally, or with external resources. The program won’t get stuck. If you start a server, an HTTP request will be responded to. The program will do a graceful shutdown if you request. And so on.
Or do you have a different definition of a deadlock than a program getting stuck and unable to end itself or a part of itself?
lethalman@reddit
Ok let’s do an example:
Program A needs to use two devices at the same time and needs to be sure that once it starts using one, it can also use the second: it locks device 1 and then locks the 2nd.
Program B does the inverse, locks 2 and locks 1.
How does your language avoid a deadlock?
faiface@reddit (OP)
How does it acquire the locks?
Let’s say it acquires them by http requests and a 200 OK response means a lock acquired.
In Par, you can’t do an http request without a timeout. So one of the programs will see a timeout error returned from the request.
So it will print an error, or whatever behavior you put in for the lock failing, but it won’t deadlock.
lethalman@reddit
But then you are assuming the other webserver is removing the lock once the request is timed out? There you have a deadlock.
faiface@reddit (OP)
Okay, so let me get more detailed.
Both program A and program B will succeed in their first request/lock of the DB. All good.
Then both will time out on the second one.
But the type system is forcing them to deal with this situation. They must do something. They can try again, but not infinitely many times.
Eventually, they both will fail their overall operations. No deadlock.
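In a conventional language, the shape that the type system is forcing here would look roughly like bounded retries with a per-attempt timeout (TypeScript sketch, placeholder URL):

```typescript
// The overall operation either succeeds or fails after a fixed number of
// attempts, each with its own timeout -- it cannot hang indefinitely.
async function acquireWithRetry(url: string, attempts = 3): Promise<Response> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fetch(url, { signal: AbortSignal.timeout(1_000) });
    } catch (err) {
      lastError = err; // timed out or failed; try again (a bounded number of times)
    }
  }
  throw lastError;
}

acquireWithRetry("https://example.invalid/lock").catch((err) => console.error(err));
```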
lethalman@reddit
Ok then we just need to deprecate lock and use only lock+timeout in all programming.
faiface@reddit (OP)
I mean that would indeed get rid of most deadlocks, yes.
But I think it's still important that in Par, the internal deadlocks are ruled out completely, while only the external ones need to be ruled out with timeouts and failures.
Compared with almost any other language where neither is ruled out at all.
But, I'm not even saying that ruling out deadlocks is such a crucial thing. It's just one of the consequences of Par's overall paradigm. There's a lot more interesting there than just that.
lethalman@reddit
Your page though says it’s impossible, without specifying “internally”.
faiface@reddit (OP)
But they are impossible because you can't do external requests without timeouts in Par.
And on the internal communication, including mutexes, you don’t need timeouts because those are guaranteed to not end up in a dependency cycle.
lethalman@reddit
Timeouts can lead to livelocks though, solving one problem and introducing another one
faiface@reddit (OP)
Glad we moved on from deadlocks to livelocks. Those are also impossible in Par, even with external interaction, because of Par's totality, and that includes preventing infinite loops.
Once again, not constructible internally.
Repeatedly making the same request forever, so a livelock like that, is a form of an infinite loop, and so impossible.
Unless, granted, there’s an exception here because there is an escape hatch. Which is mentioned in the README. Something like “unsafe”, but for infinite loops. But if you don’t use it, you can’t cause an infinite loop.
Ravarix@reddit
So how can you have a webserver update a cache in the background without an infinite loop? You want to make a call every minute to some datasource, and that datasource may be locked and you time out. That resource may be locked forever, and your cache will get out of date, but you need to be able to express a livelock like that.
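The conventional shape of that use case, as a TypeScript sketch (placeholder URL; Node 18+ assumed for the global `fetch`):

```typescript
// Refresh a cache every minute; tolerate the datasource being locked or slow
// by timing the call out and keeping the stale value until the next tick.
let cache: unknown = null;

async function refresh(): Promise<void> {
  try {
    const res = await fetch("https://example.invalid/data", {
      signal: AbortSignal.timeout(5_000), // give up if the source is locked
    });
    cache = await res.json();
  } catch {
    // Timed out or failed: keep serving the stale cache.
  }
}

const timer = setInterval(refresh, 60_000);
// A graceful shutdown just clears the timer.
process.on("SIGINT", () => clearInterval(timer));
```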
faiface@reddit (OP)
Well, currently you don't, not expressible at the moment.
But to that I gotta mention that it's not long since Par acquired capabilities to do HTTP, it's very much WIP.
What you're describing is definitely a valid use-case. The solution I envision is to be able to spin a timer repeating forever, and to be able to loop on that, but that timer would end (and let you know about it) on Ctrl+C or some other user input.
That way it wouldn't count as an infinite loop, however strange that sounds.
Ravarix@reddit
That is the same as every other language's infinite loop; you're just setting the end point to be the termination of the program.
faiface@reddit (OP)
It's not, `while true {}` can't be cancelled. An infinite loop can't be stopped to perform a graceful shutdown with cleanup if necessary.
What I described does properly cancel, and a graceful shutdown is allowed.
Ravarix@reddit
So something like goroutine which is forced to handle a context signal. We already have those graceful shutdown options when desired. Forcing a `finally` construct doesn't feel too different.
lethalman@reddit
You have a webserver example, I assume that never terminates right?
Say every request in program A locks tables 1 and 2. Every request in program B locks tables 2 and 1. They time out, rinse and repeat, making effectively no progress.
How is that solved?
faiface@reddit (OP)
Good question, but it has an answer. Of course, a web server needs to be able to handle an unbounded number of requests. Each request is an interaction with it.
We can make an even simpler example, we can have two programs sending each other's requests because they are programmed to always respond to a request with a request. They will indeed keep sending each other requests back and forth, even if they are both in Par.
So how is that not a livelock? Because those programs are not stuck. They are simply responding to each interaction, just like a button is not stuck even if you keep clicking it for a million years.
They will still react to any request that arrives, so this behavior doesn't block them in any way. And they will also still react to a request for a graceful shutdown.
A deadlock or a livelock is something that blocks a program from making progress, that makes it or a part of it unresponsive, or impossible to end. That never happens in Par, whatever you do.
lethalman@reddit
You changed example.
faiface@reddit (OP)
The point being they can't rinse and repeat forever.
lethalman@reddit
Yeah they can’t if your system handles just one request per minute for sure
faiface@reddit (OP)
Of course I can't prevent slow programs. All my type system is doing is guaranteeing that every interaction with a program will eventually be answered. And that there always is a safe and clean way to shut it down.
lethalman@reddit
Ok makes sense
faiface@reddit (OP)
Thanks for the discussion :)
lethalman@reddit
Yes interesting for sure, just remove lock and keep only trylock in any programming language.
You do have infinite loops, by definition that web server is an infinite loop, there’s an escape.
faiface@reddit (OP)
They time out, rinse and repeat a finite number of times, making no progress, so they decide to send a 500 HTTP error response back.
There, that's the progress.
lethalman@reddit
I updated the example with db tables, maybe it’s easier to discuss
lethalman@reddit
As soon as you deal with external real-life resources, be it another program, an HTTP server, or a device that can only handle one user at a time, you will incur a deadlock because you need to manage scarcity.
Inside your program yes I can believe that, but your page claims “impossible”!
faiface@reddit (OP)
Par can already do http requests, and be an http server. Yet I maintain you can’t create a deadlocking Par program, even if it does these.
So you'll need to tell me where you see the deadlock potentially occurring, and I can tell you why it doesn't occur.
lethalman@reddit
Yes you have timeouts because it’s part of the algorithm not because your language avoids deadlocks.
faiface@reddit (OP)
If you did program the whole distributed application in Par, where all calls, even distributed ones would be typed and checked by Par, it indeed could not have a deadlock.
It also could mean that the algorithm isn’t expressible! Or, depending on what features would be available, it could be expressible but only if you did add timeouts at the right places. Those features are not there yet.
But of course, if you have wholly independent programs communicating over untyped channels, then sure, timeouts are your only solution. Still, Par won’t let you write a program you can’t gracefully end.
lethalman@reddit
I can believe that, just like a language without recursion can terminate… except when they start making http calls.
Your page says it’s impossible, and you are losing my interest straightaway.
faiface@reddit (OP)
But tell me, where do you see the deadlock? Do you consider a request timing out to be a deadlock?
steve-7890@reddit
Slow by design?
faiface@reddit (OP)
In a way yes, but many successful programming concepts are like that.
Garbage collection, virtual dispatch, dynamic types.
We get expressivity by sacrificing maximum efficiency, which has clearly been successful in programming languages.
The question is only whether this one is worth it.
steve-7890@reddit
But in those cases it was about removing technical stuff that got in the way.
With async/await (like in C# or TypeScript) I DO want to know which parts of code are heavy, which touch I/O. Because that's the part that needs my attention.
lethalman@reddit
There’s already a language doing that, golang
amakai@reddit
Yeah, came to say this. Go literally started as an asynchronous processing POC and became a language on top of that.
Blue_Moon_Lake@reddit
Would make code way easier to update without needing to propagate the `async`/`await`.
By default, implicit `async`/`await` would be automatically added when needed, unless a function/method is flagged as needing to be `sync`, plus a wrapper to force a result to be handled as a promise (even if immediately settled).