By "revolution against the establishment", I take you to mean "revolution against the complexity of existing tools". Meaning, a simpler tool. You can build a tool that will do 80% of what the existing tool does, with 20% of the complexity (or maybe even 90/10). And that's great... until you need the ability to do that last 10 or 20%. Then the simple tool has trapped you.
But by then, you've got a lot of code in the new tool. So what you want is a way to do whatever part of the last 10 or 20% of power that you need for your problem. "It's just a small addition!" But there's someone else who needs a different part of the last 10 or 20%, and wants to add that part...
And so you wind up with the new tool becoming as complex as the old tool. And then, as you say, the cycle repeats.
I think that if a tool is going to be an "80% of the power at 20% of the complexity" tool, and remain that, then it has to have an escape mechanism. If you've written your 100,000 lines of simple code and you need 50 lines in a more powerful tool, there should be a clean way to use code written in a more powerful language for those 50 lines. Then the language can remain one that just has 20% of the complexity (if those in charge of the language can maintain their vision and their stubbornness).
One nice thing about Go is the existence of cgo. Yeah, it's discouraged, and rightly so, but you have that option. The ol' "Give it to C, C will do anything".
The other is IPC. Go is so dang easy at concurrency, managing data flow, async IO, etc, that I find it really lends itself to working as a cog in a larger machine, usually distributed. Don't like solving problem X in Go? Solve it however you want and just talk to your Go process.
So you have two escape hatches that were much less tenable as overall approaches even ten years ago. So hopefully Go can stay lean and mean. I think it also helps that unlike other systems languages, Go doesn't have any intention of being a catch-all language. Graphics, hard real-time, drivers? You ain't gonna reach for Go. Light scripting, data science, machine learning? Also probably not Go.
Yes, I do agree, especially because those 80% are not the same for everybody once one starts to target the language at domains it wasn't originally designed for.