Five years ago the era of stagnation ended. For roughly ten years Intel had been the king of the hill with their Core processors. Their main competitor in the processor space, AMD, struggled for a variety of reasons: they had invested in a failed architecture with the aim of bringing more processor cores to the consumer. But their processors didn’t have enough traditional punch, and AMD never got the software-level support needed to get the most out of the Bulldozer processor line. That support would have given them a more even playing field against Intel.
The end result of this failure was that Intel neglected progress: in the consumer space we were locked to a ceiling of 4 cores and 8 threads, and we only got ~5% yearly improvements. Intel no longer thought big, focusing more on keeping competitors down while taking in as much money from consumers as possible, rather than really improving the products they provide.
This all changed with Ryzen. AMD managed to create a winning architecture that delivered on Bulldozer’s promise of more cores, and these new cores also had enough punch to stand up to Intel’s offerings. And, most importantly, AMD could offer these new processors for less money. Eventually Intel was forced to bring their prices down and offer more.
The era of fast progress
Computing is still a rather young industry. The home computer market has been around for only about 50 years. The Altair was developed in 1974 and was a success in hobbyist circles. Eight years later the Commodore 64 was released, and it remains the best-selling single computer model ever: estimates range between 12.5 and 17 million units sold.
The 80’s were a good time, and the C64 stayed in production until Commodore closed down in 1994. People still bought the computer in the mid-90’s despite it being heavily outdated. A strong community was a more important buying factor than having the most performance. But with Commodore gone as a company, the shock also killed the growth of the community, and there were only limited commercial opportunities to keep supporting the system. The era of the C64 ended.
Meanwhile the industry at large kept making progress, and the biggest community formed around the IBM PC. The success of the PC is based on replaceable parts and standardization: in theory all the parts are interchangeable and you should be able to mix’n’match stuff as you please.
However, there are still generational changes that break this interchangeability with the past. Motherboards have also become dependent on the processor manufacturer: since the turn of the millennium we have not seen multiple manufacturers producing processors for a single processor socket.
The industry is mostly open, with common standards and a lot of interoperability, but it fails to achieve this throughout.
There are a lot of problems with the current way we do business, how the economy works, and how it all relates to the political landscape. Economic freedom and globalization are used as means of exploitation, and in many ways we still have not let go of the age of colonialism. This has never been a sustainable model, but it has been profitable and has been allowed to be a source of power.
Business values lack empathy and are based on hard numbers. We have all heard the mantra of “our responsibility to the shareholders”. This is the excuse used for maximizing profits over “doing the right thing”. As companies are built on the idea of eternal growth (which is impossible), and thus on making the most profit possible, any company will eventually fall into playing the game in an unfair manner.
Intel has a long history of anti-competitive behaviour, such as paying other companies to use their processors instead of AMD’s, even when AMD had the better product. And while Intel has been punished for this, I would argue they haven’t been punished badly enough. They had the money, and thus the power, to avoid being treated fairly. The silver lining is that laws and regulations have eventually caught up. Information is also much more readily available on the modern Internet, so individuals can act as watchdogs.
While the system’s ability to correct itself is nice, we still have a problem: company values remain based on the capitalist dream of profits. No matter what we do, the core game in the background stays the same, and the results we get are not in the true interest of people, but of those who want to gain more money and power.
So what kind of practical issues does this mentality give us?
- Motherboard sockets that change every two years, forcing you to always buy a new motherboard (Intel does this)
- Continuous stream of new proprietary standards (for example NVidia’s CUDA and DLSS)
- Software that drops support for older computer hardware sooner than necessary (Windows 11)
- Cheap short life computers for those who can’t afford a better one
- NVidia selling their production to crypto miners, artificially inflating consumer prices
- Production of products nobody wants or needs (example: the 99 € Intel Windows 8 tablets)
And if we’re honest there are tons more issues, but at the heart of all of this is the wasteful use of resources. You often hear how efficient the capitalist system is, but in reality it is often rather wasteful. The only things it is truly efficient at are making money, exploiting nature, and distributing power unequally. Way too many people have no choice but to play the game and work jobs that are bad for the environment.
What do we really want?
I have a dream. One day I want to be operating a 50-year-old computer. And this computer would be my daily driver. The computer should be desirable enough that a young person would want to have it and use it after I die.
Sure, it might not be all original parts. Sure, it probably has had some upgrades. But for the most part it should be a computer in the same case with a lot of original parts still doing their job.
This kind of longevity would require a lot of planning. It would be a total contrast to today, where we design products for mass production and the products keep changing all the time. The current model of production is optimized for a world of marketing, where we make people desire the new and thus abandon the old. This has some consequences:
- We optimize production to create a lot of products in a relatively short time frame.
- We focus more on providing new features over improvement of the existing features.
- Products are not designed for maximum longevity, but rather for “reasonable” longevity of less than 10 years.
The waste in this process comes from the need to produce a lot of new stuff in a short period of time. Take a product launch: companies spend a lot of money marketing a product, and a company’s success is measured by its ability to deliver enough products to make customers happy. Customers who were acquired through marketing, even if they never really needed the product in the first place.
We also waste a lot of human resources on marketing and on new features. Time that could be spent being there for others, serving the true human needs each of us has, is instead spent thinking up new slogans and new ideas that could bring in more money. This is work spent purely on feeding consumerism. Do we really want it this way?
So instead of spending our time in a continuous loop of manufacturing desire, what should we do instead?
Things that last
It is hard to shift our mental model to the idea of things that remain usable for as long as possible. We’re so used to things breaking and becoming outdated. Sustainable production in the computing industry would require a very, very big change.
First of all, we should keep improving things as we do now. Especially for the big customers who need tons of servers and all the computing power imaginable, such as the sciences. It should also remain possible for the few who want to live on the bleeding edge to have their latest and greatest.
But for the vast majority we should instead standardize a common system. Instead of being limited by competition, we would see companies working in true co-operation, agreeing on modular standards planned to be used for decades.
I think this is possible especially now: we’ve reached a point where even the lowest-end computer in a store packs enough power for the majority of daily use. Sound technology has been great for decades, graphics are getting very close to the realism we’ve desired for so long, and display technology has become quite incredible, providing the extra bit of realism that modern graphics make possible. Current video compression standards are also very good.
So we could say “this is a good enough baseline” and create a system with a great minimum expectation level. Something that is truly enough for almost everyone, for most of what they will ever do with a computer. However, specs are only one part of the equation. The hard part would be designing the thing to really last.
What goals does it need to meet?
- Modularity, which allows repairs and upgrades
- Compatibility, which allows upgrades over long time span
- Expansion, which allows upgrades
- Redundancy, which allows the system to continue working even if something breaks, and allows notifying of the need for repair
- Clarity, which allows people to understand what has broken and what needs to be done to fix it
- Software that continues to be updated
- Accessibility by default: everything that can be done visually should also be doable without a screen
- Optimally no moving parts
- Recycling, which allows material re-use once a part has broken beyond usability
These are not new ideas, but they have certainly not been pushed to their maximum over the years. For example, when an SSD fails, it is way too common for the whole drive to become unusable. There is no redundancy to allow it to continue functioning, and the error state is very unlikely to be reported in a clear manner. In a more ideal world there would be a hardware-level user interface, accessible with or without an operating system loaded, and usable even by a blind person.
A computer should not become unusable regardless of whether something breaks in the computer, or something fails in the human who uses it. This is how things should be designed to last.
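To make the redundancy and clarity goals above concrete, here is a minimal sketch in Python of how a storage layer could survive the failure of one module and report it in plain language. The `MirroredStorage` class and its behaviour are entirely hypothetical, an illustration of the principle rather than any real drive firmware:

```python
# Hypothetical sketch: a mirrored storage layer that keeps working when the
# primary module dies (redundancy) and reports the failure in human terms
# (clarity). Drives are simulated as dicts; None simulates a dead module.

class MirroredStorage:
    def __init__(self, primary, mirror):
        self.primary = primary        # dict of block_id -> data, or None if dead
        self.mirror = mirror          # identical copy of the data
        self.needs_repair = False

    def read(self, block_id):
        # Try the primary first; fall back to the mirror instead of failing outright.
        try:
            if self.primary is None:
                raise IOError("primary module not responding")
            return self.primary[block_id]
        except (IOError, KeyError):
            # Keep working, but record a clear repair notice for the user.
            self.needs_repair = True
            return self.mirror[block_id]

    def status(self):
        # A plain-language status a non-expert (or a screen reader) can act on.
        if self.needs_repair:
            return "Primary storage module failed. Replace it; your data is safe on the mirror."
        return "All storage modules healthy."

store = MirroredStorage(primary=None, mirror={1: b"user data"})
print(store.read(1))      # still returns the data despite the dead primary
print(store.status())
```

The point of the sketch is the design choice: a failure degrades the system and announces itself, rather than bricking the whole machine.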
No company would do this! This is expensive!
Yes. No current company would do this, because they live in a world of competition, not a world of true co-operation. Current companies also waste a lot of their wealth on things we don’t really need, because they have to remain competitive and keep growing into monopolies. Every step of the way they optimize to have as few people working in the company as possible, and those people must give as much of their time to the company as possible.
None of that is a sustainable model. We can’t keep having people spend most of their time in companies to keep the wasteful cycle going: the cycle of generating ever more wealth for ever fewer people. The only reason this model persists is the power of the money already amassed by the few, which keeps politicians and countries favourable to how they operate.
In the world to come, a world we hopefully reach without first driving human civilization to the brink of destruction, we really don’t need to keep producing so much stuff. This means the stuff we do create can be given a lot of time and care, and will thus be expensive in monetary terms. But that stuff will also keep functioning and remain usable, unlike the stuff the current world system creates.
In the past, buildings were truly built to last. Japanese carpenters knew how to build and maintain wooden buildings that last for over a thousand years. In modern times those carpenters are in a crisis: there are ever fewer of them, and they have ever fewer resources to build with the material they know to last, as far too many old-growth forests have been cut down. It is rare for anyone to become a carpenter in the old way: people are instead pushed through the school and university system, which, while great, lacks the practical side of thinking through experience and a specific mindset for creation.
A hundred years ago clothes were so well made that people can still inherit them from their great-great-grandparents. And these clothes are still in perfectly good condition to be worn!
I like to imagine a future combined computer store and warehouse stocked with stuff from multiple decades, where you can go on an adventure to find all kinds of neat things. And you could still make use of any of them!
We don’t need to waste money and effort on a continuous loop of consumption if we allow ourselves to do things well, and let nature grow and grow old.