Why do we seem to be reinventing the wheel in technology all the time?

Technological advancement often looks like reinventing the wheel, as Alan Kay notes in an interview. I've been thinking about why that is (particularly in the software industry) and whether it has to be that way.

I don't know what the next major step will be in the software development field, either for languages or platforms in general. It's not great to reinvent the wheel all the time, but it's worth noting that it's a slightly different wheel every time, and overall we're inching forward if we look at the user experience and what software allows us to do.

Hardware constraints are a big influence, and they keep shifting back and forth. CPU power, screen sizes, etc. grew steadily for a long time, then mobile devices constrained them again and added a new constraint of battery life (and now those constraints are relaxing once more...). If Google Glass catches on, it will start yet another such cycle.

We're also trying to push in different directions and explore the limits of different approaches, and sometimes the enthusiasm doesn't match reality (though it may still be necessary to move forward). To give a small example, at one point people thought everything could be done with VM languages, but now the pendulum seems to be swinging back, and perhaps people will instead explore how much easier programming can be made in compiled languages like C++. There is similar tension between web and native mobile applications, or between relational databases and schemaless databases. Ultimately, the different camps borrow ideas from each other, and we hopefully end up with something better than either of the old ways.

Fundamentally, the level of abstraction in programming languages has to increase to allow us to produce more sophisticated software. We keep bumping into the limits of what's manageable/possible with the existing abstractions (as Alan Kay also points out; some of his thoughts quoted here).

That means either implementing different abstractions in hardware (which is happening), or changing the software/programming languages to provide better abstractions (which is also happening, even with C++). It might seem like slow evolutionary change, but this pace of change is actually unprecedented in human history, and the computer industry is changing everything around us. Perhaps progress just can't happen any faster; maybe it requires gradual refinement, and sometimes taking a step back before going further forward.

It reminds me of that book about disruptive innovation (The Innovator's Dilemma?). Its framing of disruptive startups is flawed, but the basic idea seems right. Innovation is often about making something that has (a) a new capability and (b) a lot of limitations compared to existing products. It lets us get onto a new trajectory that ultimately takes us further, but the start of that trajectory may be behind where we currently are.

If I could wish for one thing from a software developer's point of view, it would be fewer dizzying levels of indirection. I'm thinking of the size of a typical call stack in Ruby on Rails, as well as everything that happens underneath before we get down to electrons and transistors. This layer-cake approach to abstraction creates enormous complexity and inefficiency (as well as a lot of leaky abstractions).
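As a toy illustration (this is not Rails itself, just a sketch of the general effect), each layer of indirection a framework wraps around your code adds frames to the call stack, and those frames are visible from inside the innermost method:

```ruby
# Count the stack frames above the point where "business logic" runs.
def business_logic
  caller.length
end

# A stand-in for framework layering: wrap the call in `depth`
# nested method invocations before finally yielding to the block.
def wrap(depth, &blk)
  return blk.call if depth.zero?
  wrap(depth - 1, &blk)
end

base = business_logic                  # called directly
deep = wrap(50) { business_logic }     # called through 50 layers

puts "direct: #{base} frames, through 50 layers: #{deep} frames"
```

The same logic runs in both cases; only the tower of intermediaries differs, which is roughly what a deep Rails backtrace shows at request time.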