Note: I refer to the scientists in this post by last name, not because I think I am their 'peer', but because that's how the English language works; if I put 'Mr' before every last name, I'll sound like Consuela asking for Lemon Pledge! I am 30, and will turn 31 in less than 20 days. I am the same age as the Eternal September. I was born with the web, but I hope the Web dies before me!
Also, if you don't know who Alan Kay is, don't be distraught or feel like an 'outsider' (especially if you are not much into the 'science' side of programming). Just know that he's a very important figure in CS (he is; you could look him up, perhaps?)
Now, let me explain what Kay told me.
Basically, Kay thinks the WWW people ignored the work of people like Engelbart and the NLS system, and that it was a folly.
Doug Engelbart, before UDP was even thought of and TCP was a twinkle in the eyes of its creators, before Ethernet was created, back when switches were LARGE minicomputers and ARPA was not yet DARPA, tried his hand at sending media across a network (the aforementioned ARPA's network; you may know the name from /usr/include/arpa). He even managed to do video conferencing. That was in the 1960s! He came up with a 'protocol', and this 'protocol' was not the TCP/IP stack we know today; it was completely different. I don't think he even called it that. The name of this 'protocol' was the 'oN-Line System', or NLS.
Engelbart's NLS was different from the four-layer abstraction we know and love today. It was different from the web. It was like sending 'computations' across, like a Flash clip! Kay believes the WWW people should not have sent a 'page'; they should have sent a 'process'. He basically says: "You're on a computer, goddammit, send something that can be computed! Not plain text!"
Full disclosure: yes, Kay is too brutal to Berners-Lee in this answer. I don't deny that. And his 'doghouse' analogy is a bit reductive. But I digress.
Kay believes the TCP/IP stack is sound. I think anyone who has read a networking book (like Computer Networking: A Top-Down Approach, which I have recently read through) wouldn't dispute this. But he believes people are misusing it, abusing it, or not using it right.
In the talk that the question title refers to, Kay said:
[Paraphrasing] "This is what happens when we let physicists play computer [...] If a computer scientist were to create 'web', he would do a pipeline, ending at X"
X refers to the X Window System used on UNIX systems; it's a standard, like the Web is. The implementation of X11 on my system is Xorg. It's slowly being replaced by Wayland.
So what does this mean? Well, Kay says: 'send a process that can be piped'! Does that sound dangerous and insecure? WELL, DON'T ELEVATE ITS ACCESS!
Imagine this process-based protocol were also called 'web', and the utility to interface with it were called 'wcomm', in the same vein as wget. And imagine PostScript were expressive enough to describe video. We could then fetch a video from YouTube, render it, and watch it:
$ wcomm youtube.com/fSWmufgTp6EQ.ps | mkmpg | xwatch
So what is different here? How is it different from using a utility like ytdl and piping it into VLC?
Remember, when we do that, we are getting a binary file. But in my imaginary example, we are getting a 'video description' in the form of PostScript.
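To make the distinction concrete, here is a toy Python sketch (every name and operation here is hypothetical, not any real protocol or format): the receiver of raw bytes can only play them back, while the receiver of a 'description' interprets it, so it can recompute the result however it likes.

```python
# Toy sketch (hypothetical names): opaque bytes vs. a computable 'description'.

# Opaque binary: the client can only decode and play it back as-is.
binary_frame = bytes([0x00, 0x01, 0xFF])

# A 'description': a tiny program of drawing operations, in the spirit
# of PostScript. Because the client *interprets* it, the client can
# re-render it at any scale it wants.
description = [
    ("moveto", 0, 0),
    ("lineto", 100, 100),
    ("lineto", 100, 0),
]

def render(ops, scale=1):
    """Interpret the description; the client, not the sender, picks the scale."""
    points = []
    for op, x, y in ops:
        if op in ("moveto", "lineto"):
            points.append((x * scale, y * scale))
    return points

print(render(description))           # rendered at native size
print(render(description, scale=4))  # same 'description', recomputed bigger
```

The sender ships the same bytes either way; the difference is that the description stays meaningful to the receiving computer.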
====
So anyway, as I said, I am no expert myself, but I think this is what Kay means. As Kay says himself, PostScript is too weak for today's uses! But I guess, if the Web were not this 'hacky', there would be a 'WebScript' that did the job.
Thanks.
Let's rewind the time machine... The original web concept addressed a much smaller problem: that a document could reference, or 'link' to, another document. It came right after Gopher, which was used as an index of indexes and had a text-based app that let you navigate the line items back and forth.
Add to that Ted Nelson's much older idea of 'hyperlinks' (he coined 'hypertext' back in the 1960s). The original web mashed those two ideas together and threw in a sprinkling of SGML. There was no notion of presentation styling, a GUI, use of the mouse, multimedia, animation, or 'scripting.' It was just Gopher with inline links, expressed in embedded markup.
Multiple other players (Netscape, Microsoft, IBM, et al) morphed and bolted on extensions without really considering the consequences. The thing the web had going for it was precisely this decentralized process. It made for rapid evolution, but it also meant there was (and continues to be) a lot of fragmentation. Anyone wanting to go back and revisit something hacky had a lot of legacy inertia to overcome.
So here we are today. It's a messy, junkyard jalopy, but it does just enough that nobody has the time or energy to go back and clean up the technical debt. And anything starting from scratch would have to do much better than what is there today, while offering a reason for millions of people to unlearn what they know.
As for sending 'processes': that's essentially what a VM is. You're sending a compact process as code (JavaScript, Python, WASM, a native binary) that a local runtime executes. We have app stores that manage the lifecycle, and script libraries to create abstractions and hide details. The embedded JavaScript VM is as close as we have to a universal code-execution environment.
Sending 'processes' around also has to account for malicious actors trying to do bad things. We've all seen how that ended up.
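The usual mitigation is the one hinted at above: don't elevate the received process's access. A minimal sketch in Python (the instruction names and the protocol are entirely made up for illustration) is an interpreter that only understands a whitelist of operations and grants no file or network access at all:

```python
# Toy sandbox sketch (hypothetical instruction set, not a real protocol):
# execute a received 'process' by interpreting a restricted language
# instead of running native code with the user's privileges.

ALLOWED = {"push", "add", "mul"}  # the only verbs the sandbox knows

def run_untrusted(program):
    """Interpret a received stack program; anything outside ALLOWED is rejected."""
    stack = []
    for op, *args in program:
        if op not in ALLOWED:
            raise PermissionError(f"operation {op!r} not permitted")
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack

# A benign program computes; a malicious one is refused, never executed.
print(run_untrusted([("push", 2), ("push", 3), ("add",)]))  # [5]
try:
    run_untrusted([("open_file", "/etc/passwd")])
except PermissionError as e:
    print(e)
```

This is roughly what JavaScript engines and WASM runtimes do at vastly greater scale: the guest code can compute anything, but can only touch the outside world through capabilities the host explicitly grants.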
That's not to say people shouldn't try to innovate, but at this point, it's like trying to reinvent driving or the telephone.
Wow, are you from the future? I just had this exact same thought, that JS is that 'process', so I read the ECMA-262 standard and posted a new thread about something funny I found in it. In fact, I said something that closely resembles what you said. It's just freaky!