Is x86_64 Approaching an End?

2020-12-06 22:48:54 UTC 

Had someone come to me a decade or so ago and told me:

"The industry is moving away from these large 64-bit processors."

I'd likely have laughed them out of the room. Now, I'd probably be the first one in the room to agree. This isn't a new idea, nor a light bulb moment, but it is probably an interesting marker in history.

Several years ago I wondered what "128-bit" processors would look like and how they would work. I thought they were almost a guarantee, since we already had 8-, 16-, 32-, and 64-bit processors. Surely we would need even greater data widths in the years to come?
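Funnily enough, you can already play with 128-bit integer arithmetic on today's 64-bit hardware without any new silicon. As a small sketch (assuming GCC or Clang, which expose a non-standard __int128 extension on 64-bit targets), the compiler simply lowers the wide math to pairs of 64-bit operations:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* __int128 is a GCC/Clang extension, not standard C: the compiler
           emulates it with pairs of 64-bit registers, no 128-bit ALU needed. */
        unsigned __int128 big = (unsigned __int128)UINT64_MAX * UINT64_MAX;

        /* printf has no conversion specifier for 128-bit integers,
           so print the value as two 64-bit halves. */
        printf("high: %llu low: %llu\n",
               (unsigned long long)(big >> 64),
               (unsigned long long)(big & UINT64_MAX));
        return 0;
    }

In hindsight, that is probably why general-purpose "128-bit" never arrived: the occasional wide integer can be synthesized in software, and genuinely wide data tends to flow through SIMD/vector registers rather than wider general-purpose ones.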

For the most part I've used "large" desktop, laptop, and server processors all my life. Processors that fit into a socket (ZIF/LGA), like the first one I ever put my hands on. Times have certainly changed from the Pentium 4 days around 20 years ago.

When the first Android phone came out, the HTC Dream, I could sense that ARM processors were gaining more widespread adoption for a reason. The Dream certainly wasn't the first device to use an ARM processor. My first true phone, the "HTC TyTN II", ran a super fast 400MHz ARM11 processor, and that was likely single core too. While these phones are lesser known due to their unique keyboard components, lack of "all screen" designs, and largely unknown or outdated operating systems, they still followed the ARM processor trend.

Truth be told, I learned the "art" of low-level assembly with MIPS. While ARM obviously blows MIPS out of the water in terms of market share now, MIPS is probably the last "true" reduced instruction set computer design still around (if only barely). Depending on who you talk to, ARM seems to have moved on from its true RISC roots towards a middle ground somewhere between RISC and CISC.

This article isn't a "history of ARM"; rather, it is an outlook on the CPUs I utilize on a day-to-day basis. The CPUs where I spend most of my time developing and gaming: are they still going to exist? Why is this point in history important? It could be the moment where we all end up going back to the terminal/mainframe model of yesteryear, with our terminals powered by ARM processors. I'm becoming more confident as the years roll by that now, at 1607293310 (Unix epoch), it's only a matter of time before this becomes a reality.

These large and clunky x86_64 processors from Intel and AMD will probably have some narrow space carved out for them in the future. Servers will probably still use them to handle "all the other stuff" that can't be offloaded to specialist add-in cards like DPUs. Even then, mainstream servers will likely run ARM too, for the power savings and because companies wish to avoid technical debt on an increasingly obscure architecture. It's also entirely possible we as a world will move on to RISC-V (if there's any money to be made).

Most companies today don't really care what runs their product. If the device has a web browser (and for "trendy" reasons it's likely Chromium), their product will work and they couldn't be any happier. More and more products are built on web browsers/engines for various reasons, usually portability and ease of programming. Even game companies don't care, as long as the leading game engines (Unreal/Unity) work on said device, which is a given since both engines support "HTML5" export options. Even the terminology is going this way. The term "application" used to indicate a standalone, usually installed, piece of software on a computer. Now the term is so tightly ingrained with web development that you couldn't go a day without hearing or seeing something about a "web app" (whatever that means). 2020 is the year where "compiling" CSS is a thing, there's a package manager for almost every language (even if you wouldn't want or need to use it), and there are frameworks for frameworks (so you might end up in DLL Dependency Hell all over again).

Speaking of terminology, the word "cloud" is attached to so many solutions and products being advertised that it's bewildering why it's used when so many don't understand what the term means. And because one term wasn't enough, adding in "fog" and "edge" seems to have filled up the required terminology bucket. All of these indicate a move back to mainframes and terminals, where everything is done somewhere else. I will likely explore this idea further in another article, where "you own nothing" becomes a reality.

It's not only the terminology that has changed, it's also how processors are being made and designed. Intel recently announced a processor lineup awfully reminiscent of ARM's "big.LITTLE". Packing so many and such complex instructions into a processor isn't getting any easier. While at the time of writing AMD has taken the lead in high-performance desktop CPUs, it appears Intel is doing some contingency planning, moving to more "mobile-like" designs. Sure, the informal duopoly is still burning bright for AMD and Intel, but once desktops lose even more market share, what will x86_64 look like? Will I even have a desktop at that point? Will I even need one?

This work is licensed under CC BY-SA 4.0
