The Future of Programming, circa 1973
Posted: Mon Dec 01, 2014 6:25 am
by Schol-R-LEA
This video is of a lecture given by Bret Victor in 2013 as a sort of 'future retrospective': an imagined speech delivered by Victor in 1973 about the state of the art and where it would have seemed to be heading at the time. He uses this as a critique of the lack of progress in the field and the growing stagnation of certain areas of development; basically, he points out that while there has been constant refinement and change, there has been very little real improvement in how we write programs since the 1970s, and many promising avenues of research have withered on the vine. His real point, though, is that the field has become stagnant, lacking any real groundbreaking development since around 1980 or so, a point I have been trying to explain to people for years myself.
Re: The Future of Programming, circa 1973
Posted: Mon Dec 01, 2014 7:31 am
by alexfru
C++ is the same problem today that assembler was yesterday and machine code was the day before that. The scale is different, but the problem is the same. And yet C++ keeps doing it all wrong, and ever more wrong, and it isn't going to go away anytime soon.
Re: The Future of Programming, circa 1973
Posted: Mon Dec 01, 2014 8:19 am
by SoLDMG
This guy is just brilliant. I love the fact he's using that old overhead projector as a prop.
Re: The Future of Programming, circa 1973
Posted: Mon Dec 01, 2014 8:49 am
by no92
SoLDMG wrote:This guy is just brilliant. I love the fact he's using that old overhead projector as a prop.
Re: The Future of Programming, circa 1973
Posted: Mon Dec 01, 2014 3:51 pm
by Arto
Schol-R-LEA wrote:This video is of a lecture given by Bret Victor in 2013 as a sort of 'future retrospective': an imagined speech delivered by Victor in 1973 about the state of the art and where it would have seemed to be heading at the time. He uses this as a critique of the lack of progress in the field and the growing stagnation of certain areas of development; basically, he points out that while there has been constant refinement and change, there has been very little real improvement in how we write programs since the 1970s, and many promising avenues of research have withered on the vine. His real point, though, is that the field has become stagnant, lacking any real groundbreaking development since around 1980 or so, a point I have been trying to explain to people for years myself.
Saw this a while back; a novel presentation concept, pretty well executed, on a worthwhile topic.
As an erstwhile Schemer myself, and someone who works with a colleague who has been programming almost exclusively in (variants of)
Lisp since the 1970s at MIT, I certainly appreciate the point.
I suspect that we'll see some advances if we ever manage to transcend the chasm from syntax to ubiquitous semantics (i.e., meaning). That is, much in the same way as we can now take lower levels of representation for granted (typed->lexical->syntactic), we aren't quite there yet with semantics. I work in that space myself (RDF databases), which makes painfully clear the limitations of current approaches.
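To make that syntax-versus-semantics gap a bit more concrete, here is a minimal sketch (plain Python, no RDF library; the names and the single inference rule are invented purely for illustration) of the difference between merely storing statements and actually deriving meaning from them:

Code:
# Syntactic layer: the machine can store and query these subject-predicate-object
# triples (in the spirit of RDF), but the strings carry no meaning by themselves.
triples = {
    ("Scheme", "isDialectOf", "Lisp"),
    ("Lisp",   "isA",         "ProgrammingLanguage"),
}

def entails(facts):
    """Semantic layer: one hand-written inference rule.
    If X is a dialect of Y, and Y is a Z, then X is a Z as well."""
    inferred = set(facts)
    for (x, p1, y) in facts:
        for (y2, p2, z) in facts:
            if p1 == "isDialectOf" and y == y2 and p2 == "isA":
                inferred.add((x, "isA", z))
    return inferred

for fact in sorted(entails(triples) - triples):
    print("inferred:", fact)
# prints: inferred: ('Scheme', 'isA', 'ProgrammingLanguage')

The storage-and-query half of this is routine today; the hard part is making the inference half ubiquitous and taken for granted the way parsers and type checkers are.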
Two related links:
Wirth's law, and
Systems Software Research is Irrelevant by Rob Pike.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 2:49 am
by embryo
Schol-R-LEA wrote:This video
...
His real point, though, is that the field has become stagnant, lacking any real groundbreaking development since around 1980 or so, a point I have been trying to explain to people for years myself.
What do you think the "real groundbreaking development" is?
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 6:55 am
by Schol-R-LEA
Well, that would be difficult to say, since it's hard to speculate on something that didn't happen. Project Xanadu, if it had succeeded, would have been such a groundbreaking development; the World Wide Web, on the other hand, falls short of that on a technological level, despite its massive impact on the use of computers and on the world as a whole. I have often said, somewhat facetiously, that the last really innovative developments were the invention of the spreadsheet model (1976) and the Lempel-Ziv series of compression algorithms (1977-1981), and while that's an exaggeration, it is rooted in a kernel of truth.
When you look at all the larger developments in the field since, say, 1980, most of them have been refinements or reshufflings of existing concepts, evolutionary but not revolutionary. Linux is a straightforward reimplementation of Unix; its licensing model is far more innovative than its code. Java and C# are fairly obvious developments from C++, which itself is a combination of C and Simula with some influences from Smalltalk. The WWW, as I said, has been revolutionary for society, but the technology is far less so; as a hypertext system, it is crude, even by 1989 standards, and much of the work done on it is workarounds to get through its limitations. Shells have largely given way to GUIs, but still linger as an appendix on most systems, and while the graphical user interfaces of most operating systems have gotten faster and prettier, they are not really much more functional than they were in Smalltalk-80, and in some ways less so. And, to get back to the topic of the video, the IDEs and editors we as programmers use may be refinements over their predecessors, but they still present code (and data) in basically the same linear manner, with only minimal efforts at easing the programmer's burden through the sort of automation and innovative display alternatives Victor discussed.
As a field, we are running in place; "computing in the Red Queen's square" was how I put it in 1998, and it's still true today. I don't know how to force innovation, but somehow we need a real breakthrough, something really disruptive and new.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 8:18 am
by Arto
Schol-R-LEA wrote:Linux is a straightforward reimplementation of Unix; its licensing model is far more innovative than its code.
Hmm, how do you figure? If it weren't for the infamous
Unix lawsuits at a critical time in the early 1990s, Linux might never have achieved the predominance that it did, and we'd all be using some variant of *BSD today (well, those of us using a Unix, anyhow). Open source predates Linux.
Schol-R-LEA wrote:The WWW, as I said, has been revolutionary for society, but the technology is far less so; as a hypertext system, it is crude, even by 1989 standards, and much of the work done on it is workarounds to get through its limitations.
Ah, but the billion-dollar question is: did the World Wide Web succeed despite its simplicity and crudeness, or precisely because of it?
As a fellow Schemer, I'm sure you're familiar with Richard Gabriel's classic worse-is-better line of argument. There may indeed be something to the notion that the "worse" approach yields evolutionary advantages. Certainly we see plenty of examples to suggest this in the software field, ranging from C and Unix to the World Wide Web.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 8:43 am
by Espanish
I enjoyed watching that presentation, and the ending was inspiring.
However, I disagree with how he seems to address only us programmers, and not so much the hardware engineers. I disagree because I think advancement doesn't lie solely on the shoulders of programmers, and that software and hardware have to evolve together. This could be why, even now in OS dev, C is still the predominant language instead of something much higher level.
Perhaps we got to this point willingly, so that software can focus on portability while hardware can focus on speed, with neither having to evolve. ("If it ain't broke.")
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 8:55 am
by Antti
Espanish wrote:I think advancement doesn't lie solely on the shoulders of programmers, and that software and hardware have to evolve together.
Modern hardware is ridiculously fast but modern software makes it seem slow. That is the problem.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 9:34 am
by Espanish
Antti wrote:Modern hardware is ridiculously fast but modern software makes it seem slow. That is the problem.
The way I see it, modern software is complex in what it needs to do. So if it makes the hardware seem slow, that's because the hardware is slow.
Maybe I am misunderstanding you. Are you talking about software bloat, something which is strictly a software-only problem?
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 9:51 am
by Arto
Espanish wrote:Are you talking about software bloat, something which is strictly a software-only problem?
Wirth's law:
Software is getting slower more rapidly than hardware becomes faster.
Sometimes colloquially paraphrased as:
What Intel gives, Microsoft takes away.
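As a rough worked example of what the law implies (the growth rates below are invented for illustration, not measurements): if each hardware generation is twice as fast but typical software does two-and-a-half times as much work per task, the user-perceived speed shrinks by about 20% per generation even as the raw hardware keeps getting faster.

Code:
# Toy illustration of Wirth's law; both growth rates are assumptions, not data.
hardware_speedup_per_gen = 2.0   # assumed: hardware gets 2.0x faster each generation
software_demand_per_gen  = 2.5   # assumed: software does 2.5x more work per task

perceived = 1.0
for gen in range(1, 6):
    perceived *= hardware_speedup_per_gen / software_demand_per_gen
    print(f"generation {gen}: perceived speed = {perceived:.2f}x of the original")
# Each generation the ratio shrinks by a factor of 0.8, so the same task feels
# slower even though the hardware is many times faster in absolute terms.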
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 9:57 am
by embryo
Schol-R-LEA wrote:Well, that would be difficult to say, since it's hard to speculate on something that didn't happen.
I suppose it is possible to predict the answer: a single "Do it all" button. It still doesn't work? Then we lack any real groundbreaking development!!!
But now seriously: could you have imagined an iPhone 20 years ago? There was the Newton, and it really did keep your notes and show them back to you. But an iPhone?! It's like what was called a supercomputer in those days. And it's not only the hardware; the UI ideas were also very far from it. And the phone itself: what did a cell phone cost back then?
Or compare this forum with the old text-only Usenet conferences. How many actions were required to post a message? Today it's just "login", "read", "quote", "edit", "post": two productive and three helper actions. How much time do the helper actions take? One second for "quote", another second for "post", and no time at all for "login" if you use the "remember me" option. Then it's worth comparing formatting, embedded images, attachments and many other forum features. Of course, we can say "a human is a straightforward reimplementation of an ape" and argue that it is still possible to use the old-style conferences. But for some reason people prefer modern forum software.
Schol-R-LEA wrote:I have often said, somewhat facetiously, that the last really innovative developments were the invention of the spreadsheet model (1976) and the Lempel-Ziv series of compression algorithms (1977-1981), and while that's an exaggeration, it is rooted in a kernel of truth.
I very rarely use spreadsheets, so I can't say it was a "really innovative development". It just means opinions differ: somebody likes the zip algorithm, somebody else likes the iPhone. Your excitement was in the past and has since burned away, but other people still have a fire within and see the world as much brighter.
Schol-R-LEA wrote:I don't know how to force innovation, but somehow we need a real breakthrough, something really disruptive and new.
Maybe we just need a way to excite the burned-out people again? The innovations are here, disruptive changes are happening every second, but sometimes it is hard to notice them, especially if your eyes keep looking back at the burned-out past.
One secret for a really good innovation: just take something bad and make it much better. In the end you will like the result.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 10:02 am
by embryo
Espanish wrote:The way I see it, modern software is complex in what it needs to do.
So if it makes the hardware seem slow, that's because the hardware is slow.
Does your proposal mean trading software developer time for better hardware? But what about hardware developer time? Who should pay the bigger part?
So it is the human being that is really slow.
Re: The Future of Programming, circa 1973
Posted: Tue Dec 02, 2014 10:33 am
by no92
embryo wrote:So it is the human being that is really slow.
That's only true in some cases. If everyone works by himself, things tend to be very slow; in that scenario there are many different projects all doing basically the same thing. If everyone programming something (e.g. operating systems) started working together with all the others and formed one single big project, everything would move much faster. Linux is developed relatively quickly and can deal with most hardware on the market right now, but some random hobby OS can't.
The only thing standing in the way is Brooks' law, which can be worked around by assigning well-defined jobs.