PC processors entered the Gigahertz era today in the year 2000 with AMD's Athlon (tomshardware.com)
145 points by LorenDB 8 hours ago
xnx 7 hours ago
The Megahertz Wars were an exciting time. Going from 75 MHz to 200 MHz meant that everything (CPU limited) ran 2x as fast (or better with architectural improvements).
Nothing since has packed nearly the impact with the exception of going from spinning disks to SSDs.
dlcarrier 6 hours ago
In my experience, SSDs had a bigger impact. Thanks to Wirth's Law (https://en.wikipedia.org/wiki/Wirth%27s_law), the steady across-the-board increase in processing power didn't equate to programs running much faster; e.g. Discord running on a modern computer isn't any more responsive, if not less so, than an ICQ client was on a computer 25 years ago.
SSDs provided a huge bump in performance to each individual computer, but trickled their way into market saturation over a generation or two of computers, so you'd be effectively running the same software but in a much more responsive environment.
majormajor 6 hours ago
Anytime you upgraded from a 4 year old computer to a new one back then - from 16MHz to 90MHz, or 75MHz to 333MHz, or 333MHz to 1GHz, or whatever - it was immediate, it was visceral.
SSDs booted faster and launched programs faster and were a very nice change, but they weren't that same sort of night-and-day 80s/90s era change.
The software, in those days, was similarly making much bigger leaps every few years. 256 colors to millions, resolution, capabilities (real time spellcheck! a miracle at the time.) A chat app isn't a great comparison. Games are the most extreme example - Sim City to Sim City 2000; Doom to Quake; Unreal Tournament to Battlefield 1942 - but consider also a 1995 web browser vs a 1999 one.
gavinsyancey 6 hours ago
> Discord running on a modern computer isn't any more responsive, if not less so, than an ICQ client was on a computer 25 years ago.
The only thing more impressive than hardware engineers delivering continuous massive performance improvements for the past several decades is software engineers' ability to completely erase that with more and more bloated programs that do essentially the same thing.
vachina 6 hours ago
> Discord running on a modern computer isn't any more responsive, if not less so, than an ICQ client was on a computer 25 years ago.
I feel this. Humanity has peaked.
beastman82 5 hours ago
Agree 100%. The compute was always bottlenecked by insanely high I/O latency. SSDs opened up fast computers like no processor ever did.
lich_king 3 hours ago
Eh. In the 1980s and 1990s, the capabilities of the software you could run on your new computer were changing dramatically every two years or so. Completely new types of computer games and productivity software, vastly improved audio and video, more and more real-time functionality.
Nowadays, you really don't get these magical moments when you upgrade, not on the device itself. The upgrade from Windows 10 to Windows 11 was basically just more ads. Games released today look about as good as games released 5-10 years ago. The music-making or photo-editing program you installed back then is still good. Your email works the same as before. In fact, I'm not sure I have a single program on my desktop that feels more capable or more responsive than it did in 2016.
There's some magic with AI, but that's all in the cloud.
steve1977 5 hours ago
I mean, HDDs were much faster than floppy disks. Which were in turn much faster than tape cassettes. And so on...
idiotsecant 6 hours ago
This is silly. That's like saying that machines haven't gotten any better because a helicopter doesn't eat any less hay than a horse did.
embedding-shape 7 hours ago
> Nothing since has packed nearly the impact with the exception of going from spinning disks to SSDs.
"Bananas" core-counts gave me the same experience. Some year ago I moved to Ryzen Threadripper and experienced similar "Wow, compiling this project is now 4x faster" or "processing this TBs of data is now 8x faster", but of course it's very specific to specific workloads where concurrency and parallism is thought of from the ground up, not a general 2x speed up in everything.
st_goliath 7 hours ago
> The Megahertz Wars were an exciting time.
About a week ago, completely out of the blue, YouTube recommended this old gem to me: https://www.youtube.com/watch?v=z0jQZxH7NgM
A Pentium 4, overclocked to 5GHz with liquid nitrogen cooling.
Watching this was such an amazing throwback. I remember clearly the last time I saw it, when an excited friend showed it to me on a PC at our school's library. A year or so before YouTube even existed.
By 2005, my Pentium 4 Prescott at home ran at some 3.6GHz without overclocking, 4GHz models for the consumer market had already been announced (but were plagued by delays), and surely 10GHz was "just a few more years away".
accrual 4 hours ago
IIRC, part of the GHz problem is that very long pipelines like the Pentium 4's are what make such high clocks possible in the first place. If you can keep the pipeline full, the system reaps the benefits. Sort of like a drag racer - goes very fast in a straight line but terrible on corners.
But with longer pipelines comes larger penalties when the pipeline needs to be flushed, so the P4 eventually hit a wall and Intel returned to the late Pentium 3 Tualatin core, refining it into the Pentium M which later evolved into the first Core CPUs.
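You can feel that flush penalty from userspace (a rough sketch; sizes are arbitrary, and an optimizing compiler may hide the branch behind a cmov, so build with something like -O1):

    /* Micro-benchmark of branch misprediction cost: the same loop over
       the same values runs much faster once the data is sorted, because
       the branch becomes predictable and stops flushing the pipeline. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 20)

    static void time_sum(const int *data, const char *label)
    {
        clock_t start = clock();
        long long sum = 0;
        for (int rep = 0; rep < 100; rep++)
            for (int i = 0; i < N; i++)
                if (data[i] >= 128)   /* ~50/50 on random bytes */
                    sum += data[i];
        printf("%s: sum=%lld, %.2fs\n", label, sum,
               (double)(clock() - start) / CLOCKS_PER_SEC);
    }

    static int cmp(const void *a, const void *b)
    {
        return *(const int *)a - *(const int *)b;
    }

    int main(void)
    {
        int *data = malloc(N * sizeof *data);
        if (!data) return 1;
        for (int i = 0; i < N; i++)
            data[i] = rand() % 256;
        time_sum(data, "random");     /* mispredicts constantly   */
        qsort(data, N, sizeof *data, cmp);
        time_sum(data, "sorted");     /* near-perfect prediction  */
        free(data);
        return 0;
    }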
fnord77 6 hours ago
Only just last year did someone goose a PC CPU to 9.13GHz:
https://www.tomshardware.com/pc-components/cpus/core-i9-1490...
OldSchool 2 hours ago
When Alder Lake finally made a sizable jump, I looked at decades of old tests I'd done along the way with CPUs and tried to bridge them together reasonably.
Between IPC (~50 to 100-fold improvement) and clock speed increases (1000-fold alone), I estimated that single-thread performance has increased on the order of 50,000x - 100,000x since the 4.77 MHz 8088.
In human terms this is like one minute compared to one month!
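That back-of-envelope checks out (a quick sketch; the ~5 GHz endpoint and the 50-100x IPC range are the estimates above, not measurements):

    /* ~1000x clock (4.77 MHz -> ~5 GHz) times 50-100x IPC. A month is
       ~43,200 minutes, so "one minute vs one month" is the right scale. */
    #include <stdio.h>

    int main(void)
    {
        double clock_gain = 5.0e9 / 4.77e6;           /* ~1048x    */
        printf("low:  %.0fx\n", clock_gain * 50.0);   /* ~52,000x  */
        printf("high: %.0fx\n", clock_gain * 100.0);  /* ~105,000x */
        printf("minutes per month: %d\n", 30 * 24 * 60);  /* 43,200 */
        return 0;
    }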
rr808 6 hours ago
I still remember my first CPU with a heatsink. It seemed like a temporary dumb hack.
iknowstuff 3 hours ago
Well, it kinda was! Seeing how power-efficient iPhone chips are despite hovering near the top of single-core benchmarks.
oso2k 4 hours ago
I had the same reaction back in the 90s when I upgraded my Cyrix 486 SLC2 50MHz without a heat sink (which seems like a no-no in retrospect) to a Cyrix MediaGX 133MHz. The stock fan was immediately noticeable. I thought I had done something wrong.
HPsquared 7 hours ago
SSDs were such a revolution though, and a really rewarding upgrade. I'd fit SSDs to friends' and family members' computers as an upgrade.
micv 7 hours ago
Getting my first SSD was absolutely the best computer upgrade I've ever bought. I didn't even realise how annoying load times were because I was so used to them and coming from C64s and Amigas even spinning rust seemed fairly quick.
It took a long time before I felt a need to improve my PC's performance again after that.
sigmoid10 7 hours ago
I once had a decade old Thinkpad that suddenly became my new work laptop once more thanks to an SSD. It's a true shame they simply don't make them like this anymore.
patwolf 3 hours ago
I owe much of my career to an SSD. I had a work laptop that I upgraded myself with an 80GB Intel SSD, which was pretty exotic at the time. It was so fast at grepping through code that I could answer colleagues’ questions about the code in nearly real time. It was like having a superpower.
dcminter 7 hours ago
Just before I installed an SSD was the last time I owned a computer that felt slow.
pdpi 4 hours ago
I think the single biggest jump I ever experienced was my first dedicated GPU — a GeForce 2 MX if I'm not mistaken.
nunez 4 hours ago
Agreed. That was the next big boost! I installed my first SSD in this HP workstation-grade laptop that we got "for free" from college. It was like getting a brand new computer! In fact, I ended up giving that computer to my sister who ran it into the ground.
I didn't feel any huge speed boosts like that until the M1 MacBook in 2020.
geon 7 hours ago
GPUs for 3d graphics were a game changer.
I can see why you wouldn’t consider it as impactful if you weren’t into gaming at the time.
kwanbix 3 hours ago
My first Pentium was clocked at 60MHz.
iwontberude 3 hours ago
I remember our school getting new computers to replace the 233MHz G3 iMac computer lab during the Megahertz Wars, and the vice principal announcing the purchase of new "screaming fast" 600MHz Dell OptiPlex GX100s. The nice thing is that the G3 iMacs then got pushed out to the classrooms, but it was sad to see Apple lose its spot in the lab. I miss the wonder of playing Pangea Software games like Bugdom and Nanosaur for the first time.
jmyeet 6 hours ago
That wasn't how it worked.
Up until the 486, the clock speed and bus speed were basically the same and topped out at about 33MHz (IIRC). The 486 started the thing of making the CPU speed a multiple of the bus speed, e.g. 486dx2/66 (33MHz CPU, 66MHz bus), 486dx4/100 (25MHz CPU, 100MHz bus). And that's continued to this day (kind of).
But the point is the CPU became a lot faster than the IO speed, including memory. So these "overdrive" CPUs were faster but not 2-4x faster.
Also, in terms of impact, yeah there was a massive increase in performance through the 1990s, but let's not forget the first consumer GPUs, namely 3dfx Voodoo and later NVidia and ATI. Oh, Matrox Millennium anyone?
It's actually kind of wild that NVidia is now a trillion dollar company. It listed in 1998 for $12/share and adjusted for splits, Google is telling me it's ~3700x now.
giantrobot 41 minutes ago
You got your multipliers backwards with the 486dx. The multiplier was on the CPU core rather than the bus. A dx2 ran at twice the memory bus speed. The dx4 was (confusingly) three times the bus speed. So a 486dx4/100 was a 33MHz bus with a 100MHz core.
varispeed 6 hours ago
I don't know. I felt this way when switching from Intel laptop to Apple M1. I am still using it today and I prefer it over desktop PC.
Aurornis 18 minutes ago
I also went from an Intel MacBook Pro to an M1 and appreciated it, but that leap was exaggerated by how bad the last few generations of Intel MacBook Pros were.
The Apple Silicon chassis also finally allowed for an appropriate cooling solution. They are much quieter than the equivalent Intel laptops when dissipating the same power levels.
embedding-shape 5 hours ago
Have you ever used proper desktop computers? I suppose such a move would feel significant if you've mostly been using laptops.
nunez 4 hours ago
What a time to be a kid then.
We had a hand-me-down DEC x86 desktop at home with a Pentium II running at 233 MHz until I want to say 2002? This was around the time I learned how to build a PC since doing that was cheaper than buying one and no-one in my family had the money for that!
I saved whatever money I could to buy a 128MB stick of RAM from Staples (maybe it was 256MB?), a few other things from TigerDirect/Newegg and _this processor_. With some help from my uncle and a guide I printed from somewhere whose website started with '3D' (it was quite popular back then; I don't think it exists anymore), I got it done.
Going from 233 MHz to this was like going from walking to flying in a jet! Everything was SO MUCH F**ING FASTER. Windows XP _flew_. (The DEC barely made the minimum requirements for it, and boy did I feel it.) Trying to install Longhorn on it a year or two later brought me back into walking again, though. :D
lovehashbrowns 3 hours ago
The first PC I built was with an AMD Athlon 64 4000+ and a GeForce 6600GT. Going to that from an eMachines piece of junk was INSANE. It's so hard to come up with a similar experience shift nowadays. Even websites seemed to load instantly with the same DSL connection. Everything felt soooooo good.
crims0n 2 hours ago
You reminded me that building used to be considerably cheaper than buying.
I remember my teen years, doing odd jobs to get some cash, buying a part at a time until the build was complete. Worrying that if you didn't scrape together enough parts soon, there might be an architecture change. Finally getting it all together and the feeling of pure bliss installing the OS, troubleshooting drivers, installing this or that. Good times.
general_reveal 3 hours ago
GeForce 3.
Sharlin 7 hours ago
The i486DX 33MHz was introduced in May 1990. A 30x increase, or about five doublings, in clock speeds over ten years. That's of course not the whole truth; the Athlon could do much more in one cycle than the 486. In any case, in 2010 we clearly did not have 30GHz processors – by then, the era of exponentially rising clock speeds was very decidedly over. I bought an original quadcore i7 in 2009 and used it for the next fifteen years. In that time, roughly one doubling in the number of cores and one doubling in clock speeds occurred.
adrian_b 7 hours ago
"The era of exponentially rising clock speeds" was already over in 2003, when the 130-nm Pentium 4 reached 3.2GHz.
All the later CMOS fabrication processes, starting with the 90-nm process (in 2004), have provided only very small improvements in clock frequency, so that now, 23 years after 2003, desktop CPUs still have not doubled that clock frequency.
In the history of computers, the decade with the highest rate of clock frequency increase has been 1993 to 2003, during which the clock frequency has increased from 67 MHz in 1993 in the first Pentium, up to 3.2 GHz in the last Northwood Pentium 4. So the clock frequency had increased almost 50 times during that decade.
For comparison, in the previous decade, 1983 to 1993, the clock frequency in mass-produced CPUs had increased only around 5 times, i.e. at a rate about 10 times slower than in the next decade.
hedora 6 hours ago
Sort of: The Pentium 4 was a strange chip. It had way too many pipeline stages, and was basically just chasing high clock speed marketing numbers instead of performance. In other words, it hit "3.2GHz" by cheating.
I'd argue you'd need to look at AMD's Athlon XP or 64-bit processors, or Intel's Pentium 3 / Core 2 Duo lines, to figure out when clock speeds stopped increasing.
layer8 7 hours ago
On the plus side, the 486DX-33 didn’t require active cooling. The second half of the 1990s was when home computing started to become noisy, and the art of trying to build silent PCs began.
johnflan 3 hours ago
The CPU didn’t, but the chonker of a fan in the PSU sure made up for it.
bee_rider 6 hours ago
It is true that we haven’t seen single core clock speeds increasing as fast, for a long while now. And I think everyone agrees that some nebulously defined “rate of computing progress” has slowed down.
But, we can be slightly less pessimistic if we’re more specific. Already by the early ’90s, a lot of the speed increase came from strategies like pipelining, superscalar execution, and branch prediction. Instruction-level parallelism. Then in the 2000s we started using additional parallelism strategies like multicore and SMT.
It isn’t a meaningless distinction. There’s a real difference between parallelism that the compiler and hardware can usually figure out, and parallelism that the programmer usually has to expose.
But there’s some artificiality to it. We’re talking about the ability of parallel hardware to provide the illusion of sequential execution. And we know that if we want full “single threaded” performance, we have to think about the instruction level parallelism. It’s just implicit rather than explicit like thread-level parallelism. And the explicit parallelism is right there in any modern compiler.
If the syntax of C was slightly different, to the point where it could automatically add OpenMP pragmas to all its for loops, we’d have 30GHz processors by now, haha.
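For what it's worth, the explicit version of that joke is one pragma away (a minimal sketch, assuming an OpenMP-capable toolchain; compile with something like gcc -fopenmp):

    /* Programmer-exposed thread-level parallelism: one pragma spreads
       the loop across cores, with a reduction to keep the sum correct.
       The instruction-level parallelism inside each thread is still
       found implicitly by the compiler and hardware. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <omp.h>

    int main(void)
    {
        const int n = 1 << 24;
        double *a = malloc(n * sizeof *a);
        double sum = 0.0;
        if (!a) return 1;

        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < n; i++) {
            a[i] = i * 0.5;
            sum += a[i];
        }

        printf("sum=%g using up to %d threads\n", sum, omp_get_max_threads());
        free(a);
        return 0;
    }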
hedora 6 hours ago
Clock speed increases definitely slowed down, but now that software can use parallelism better, we're seeing big wins again. Current desktop/laptop packages are doing 100 trillion operations per second. The article's processor could do one floating point op per cycle, or about 1B ops/second. So we've seen a ~100,000x speedup in the last 25 years. That's a doubling every ~1.5 years since 2000.
It's not quite apples-to-apples, of course, due to floating point precision decreasing since then, vectorization, etc., but it's not like progress stopped in 2000!
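The doubling-rate arithmetic holds up (a quick check, taking the 100 Tops/s and ~1 Gflop/s figures above at face value; link with -lm):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double speedup = 100e12 / 1e9;     /* ~100,000x        */
        double doublings = log2(speedup);  /* ~16.6 doublings  */
        printf("%.1f doublings, one every %.2f years\n",
               doublings, 25.0 / doublings);  /* ~1.5 years each */
        return 0;
    }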
lysace 5 hours ago
Web browsing is still largely single/few-threaded in practice, afaik. (Right?)
hedora 7 hours ago
The Athlon XP was the bigger milestone, as I remember it.
They were both "seventh generation" according to their marketing, but you could get an entire GHz+ Athlon XP machine for much less than half the $990 tray price from the article.
I distinctly remember the day work bought a 5 or 6 node cluster for $2000. (A local computer shop gave us a bulk discount and assembled it for them, so sadly, I didn't poke around inside the boxes much.)
We had a Solaris workstation that retailed for $10K in the same office. Its per-core speed was comparable to one Athlon machine, so the cluster ran circles around it for our workload.
Intel was completely missing in action at that point, despite being the market leader. They were about to release the Pentium 4, and didn't put anything decent out from then until the Core 2 Duo. (The Pentium 4 had high clock rates, but low instructions per cycle, so it didn't really matter. Then AMD beat Intel to market with 64-bit support.)
I suspect history is in the process of repeating itself. My $550 AMD box happily runs Qwen 3.5 (32B parameters). An nvidia board that can run that costs > 4x as much.
ahartmetz 4 hours ago
The article links to a list of "The five greatest AMD CPUs". I've owned two and a half of these! Athlon XP 1800+, Ryzen 7 1700 (I had the 1800X which was just a higher bin of the same chip), and Ryzen 9 3950X.
That same article also says that extending x86 to 64 bits "wasn't hard", which I'm not so sure about. There are plenty of mistakes AMD could have made and cleanups they could have missed, but they handled it all quite well AFAICS.
iknowstuff 3 hours ago
Next up AI 395+. I love this thing. Sips power!
random3 5 hours ago
Fun times. Coolers, paste, fans, supply watts, DIP switches and jumpers. Quake, 3dfx Voodoo vs NVidia GeForce. This is where it all started, kids.
I was in high school and had been running a "computer games club" (~ Internet cafe for games and kids) since 1998, when we got a place, renovated it ourselves, got custom-built furniture (cheap narrow desks) and initially 6 computers - AMDs at 300MHz. By 2000 we had broken through a wall into the adjacent space and had ~15 machines, cable + satellite internet for downloads, and whatever video cards we could buy or scrap. It was wild.
Razengan 2 hours ago
> This is where it all started, kids.
Nah.. Cassettes, computers-in-a-keyboard, booting straight into BASIC.. THIS is where it all started, grandkids.
random3 39 minutes ago
Hahaha! True. I was pointing out the beginnings of modern GPUs and NVidia, but yes, I heard the cassette screeching before the modem screeching, indeed.
nikanj 4 hours ago
Finding high school kids with a similar "tech" background today seems really hard. Tech users, sure, chronic phone / game addicts are everywhere, but that tweaker spirit is rare
ehnto 5 hours ago
I bought a whole bunch of parts with my first Athlon. I think I bought a Sound Blaster, and a Radeon GFX card if I am remembering the timeline right. The Sound Blaster came with a demo of a Lara Croft game that used the then-incredible spatial audio processing to great effect. The industry promptly forgot about that technology, and to this day game audio rarely matches the potential of real-time spatial dynamics that we reached 20 years ago.
phil21 3 hours ago
Spatial audio is pretty good these days on some titles. It's just most folks don't have a sound system that can really do much with it. Headphones can only do so much in this arena.
Couple a modern AAA title (like Battlefield 6, etc.) with a proper Atmos sound system and you will likely be pretty amazed. Even a simple 5.1 setup is pretty decent for hearing footsteps behind you/etc. which actually does help with gameplay.
I haven't kept up on it as my computer gaming area doesn't lend itself towards a proper speaker setup these days, but playing with headphones on lately has made me start to look into this again. I need to find some high quality tiny cube speakers or something to be able to put in weird spots on the ceilings/walls.
mtucker502 8 hours ago
What progress is being made in overcoming the current thermal limits blocking us from high clock rates (10GHz+)?
sparkie 5 hours ago
That's not going to happen, but there's alternative research such as [1] where we get rid of the clock and use self-timed circuits.
vessenes 7 hours ago
Like any doubling rule, the buck has to stop somewhere. Higher energy usage + smaller geometry means much more exotic analog physics to worry about in chips. I’m not a silicon engineer by any means, but I’d expect 10GHz cycles will be optical, or very exotically cooled, or not coming at us at all.
adrian_b 7 hours ago
Reaching 10 GHz for a CPU will never be done in silicon.
It could be done if either silicon is replaced with another semiconductor, or semiconductors are replaced with something else for making logic gates, e.g. organic molecules, enabling logic gates designed atom by atom.
For the first variant, i.e. replacing silicon with another semiconductor, research is fairly advanced, but this would increase fabrication costs, so it will be done only when all methods for further improvement of silicon integrated circuits become ineffective or too expensive, which is unlikely to happen earlier than a decade from now.
FpUser 6 hours ago
Having RAM read/write faster would be of way more benefit.
amelius 5 hours ago
There have been overclockers who reached 9GHz using liquid helium.
It's simply impossible at room temperature; it requires extreme cooling.
Also you will run into interconnect speed issues, since 10GHz corresponds to 0.1 nanoseconds per cycle, which corresponds to 3 centimeters of signal travel (assuming light speed; in reality it is lower).
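For concreteness, the distance arithmetic (the ~0.5c figure for on-chip propagation below is an assumption; real values depend on the interconnect):

    #include <stdio.h>

    int main(void)
    {
        double c = 2.998e8;           /* speed of light, m/s      */
        double period = 1.0 / 10e9;   /* one 10 GHz cycle: 0.1 ns */
        printf("at c:     %.1f cm\n", c * period * 100.0);        /* ~3.0 */
        printf("at ~0.5c: %.1f cm\n", 0.5 * c * period * 100.0);  /* ~1.5 */
        return 0;
    }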
So sadly, we'll be stuck in this "clock-speed winter" for a little longer.
brennanpeterson 7 hours ago
None for normal compute, since energy density is still fundamental. But the interesting option is cryogenic computing, which can have near-zero switching energy and clock rates in the tens of GHz.
Some neat startups to watch for in this space.
magic_man 7 hours ago
The dynamic power consumed is C·V²·f. It makes no sense to keep increasing frequency when it makes power consumption way worse.
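A toy illustration of why (every number below is made up): frequency enters P = C·V²·f linearly, but sustaining a higher frequency usually also demands a higher voltage, which enters squared.

    #include <stdio.h>

    int main(void)
    {
        double C = 1.0e-9;  /* switched capacitance, F (invented) */
        double V = 1.0;     /* core voltage, V (invented)         */
        double f = 3.0e9;   /* clock, Hz                          */

        printf("base:       %.1f W\n", C * V * V * f);                    /* 3 W  */
        printf("2x f:       %.1f W\n", C * V * V * (2 * f));              /* 6 W  */
        printf("2x f and V: %.1f W\n", C * (2 * V) * (2 * V) * (2 * f));  /* 24 W */
        return 0;
    }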
dlcarrier 6 hours ago
At lower frequencies, leakage current plays a larger role than gate capacitance, so for any given process node there's a sweet spot. For medium to low loads, it takes less power to rapidly alternate between cutting power to a core and running it at a higher frequency than needed ("race to idle") than to run continuously at a lower frequency.
Newer process nodes decrease the per-gate capacitance, increasing the optimal operating frequency.
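A toy energy model of that sweet spot (all numbers invented for illustration): leakage is paid for as long as the core is powered, so finishing fast and power-gating can beat running slow.

    #include <stdio.h>

    int main(void)
    {
        double p_dyn_fast = 10.0;  /* dynamic W at full clock (invented)  */
        double p_dyn_slow = 4.0;   /* dynamic W at half clock, assuming a
                                      modest voltage reduction (invented) */
        double p_leak = 3.0;       /* leakage W while powered (invented)  */

        double race = (p_dyn_fast + p_leak) * 1.0;  /* 1 s, then gated: 13 J */
        double slow = (p_dyn_slow + p_leak) * 2.0;  /* 2 s, always on:  14 J */

        printf("race-to-idle: %.0f J, run-slow: %.0f J\n", race, slow);
        return 0;
    }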
vlovich123 7 hours ago
So, heat. There are efforts to switch to optics, which don’t have the heat problem so much, but they have the problem that it’s really hard to build an optical transistor. Plus, anywhere you’re interfacing with the electrical world, you’re back to the heat problem.
Maybe reversible computing will help unlock several more orders of magnitude of growth.
HarHarVeryFunny 7 hours ago
What would be the benefit? You don't need a 10GHz processor to browse the web, or edit a spreadsheet, and in any case things like that are already multi-threaded.
The current direction of adding more cores makes more sense, since this is really what CPU intensive programs generally need - more parallelism.
michaelt 5 hours ago
Because someone decided to write all the software in JavaScript and Python, which don't benefit from the added cores.
nurettin 6 hours ago
Single-core speed is absolutely a thing that is needed and preferred over multicore. That's why we have AVX, AMX, etc.
vaylian 6 hours ago
You technically don't even need a 300MHz processor for the use cases that you name. But Intel and others kept developing faster CPUs anyway.
moffkalast 5 hours ago
For parallelism we already have SIMD units like AVX and well... GPUs. CPUs need higher single thread speeds for tasks that simply cannot make effective use of it.
hulitu 3 hours ago
> You don't need a 10GHz processor to browse the web, or edit a spreadsheet,
To browse the web is debatable. But for svchost.exe, Teams, Office 365 and Notepad, you definitely need one. /s
Programming is a lost art.
dd_xplore 8 hours ago
I remember back in 2006 I used to browse overclocking forums to overclock my Pentium 4. I had tons of fun following lots of instructions; I learned the BIOS, changed PLL clocks, memory clocks, etc.
rckclmbr 7 hours ago
I bought a car radiator and Dremeled out my case, then visited Home Depot for all the tubes and connectors. It’s too easy nowadays to add watercooling.
fleventynine 5 hours ago
I upgraded to this exact CPU from a 200MHz pentium in the fall of 2000. Easily the largest jump in performance of any upgrade I've ever done.
paulryanrogers 6 hours ago
My first 1GHz was an AMD, also my first non-Intel, and its required fan was so loud that I was glad to get rid of it.
The speed was nice, and some competition helped lower prices.
herodoturtle 6 hours ago
I remember upgrading my 486 DX2 66MHz to a DX4 100MHz and all of a sudden being able to run Winamp and Quake. That felt pretty epic at the time.
davidee 6 hours ago
I have very fond memories of my first dual-cpu Athlon machine.
It was the workstation on which I learned Logic Audio before, you know, Apple bought Emagic. I took that machine, running very-low-latency Reason, to live gigs with my band.
Carting around a full-tower computer (not to mention the large CRT monitor we needed) next to a bunch of tube Fender & Ampeg amps was wild at the time. Finding a good drummer was hard; we turned that challenge into a lot of fun programming rhythm sections we could jam to, and control in real-time, live.
nikanj 4 hours ago
The craziest thing is, I don't actually know how many gigahertz either my PC or my MacBook runs at. The megahertz race used to be fierce!
myself248 4 hours ago
It's essentially random at any given moment. If I peek, mine will say it's running anywhere between 700MHz and 3.4GHz. Sometimes I think it goes even faster, but only if it's weirdly cold at the time.
jmyeet 5 hours ago
I have a hard time remembering what computers I had in the 1990s now. I had an 8086 in the 1980s. I think the next one I had was a 486/33 in the early 90s, and I had this for years. I remember having a Cyrix 586 at some point later. I think the next jump was in the early 2000s and I honestly don't remember what that CPU was, so I can't say when I got my first 1GHz+ CPU. Probably that 2002 PC. No idea what it was now. But it did survive in some form for another 12 years.
Fun fact #1: many today may not know that the only reason Intel switched to the Pentium name was because a court ruled that they couldn't trademark a number, and Intel had cross-licensed the microarchitecture and instruction set to AMD and Cyrix.
It was the Pentium 4 when clock speeds went insane and became a huge marketing point, even though Pentium chips had lower IPC than Athlons (at that time). There was a belief that CPUs would keep going to 10GHz+. Instead they hit a ceiling at about ~3GHz that has barely risen to this day (ignoring burst modes).
Intel originally intended to move workstations and servers to the EPIC architecture (e.g. Merced was an early chip in this series). This began in the 1990s but was years delayed and required writing software in a very particular way. It never delivered on its promise.
And AMD, thanks to the earlier cross-licensing agreement, just ate Intel's lunch with the Athlon 64 starting in 2003 by adding the x86_64 instructions, which we still use today.
Fun Fact #2: it was the Pentium 3 that saved Intel's hide long after it was discontinued in favor of the Pentium 4.
The early 2000s were the nascent era of multi-core CPUs. The Pentium 3 had survived in mobile chips and become the Pentium-M and then the Core Duo (and Core 2 Duo later). This was the Centrino platform and included wireless (IIRC 802.11b/g). The Pentium 4 hit the gigahertz ceiling and EPIC wasn't going to happen, so Intel went back to the drawing board, revived the mobile Pentium 3 platform, added AMD's 64-bit instructions, and released their desktop CPUs. Even modern Intel CPUs are in many ways a derivation of the Pentium 3 [1].
[1]: https://en.wikipedia.org/wiki/List_of_Intel_Core_processors
1970-01-01 7 hours ago
Argh. The headline. The opener. Awful. Where are editors in 2026? There's no way an LLM would write this.
The GHz barrier wasn't special. What was much more important was the fact that AMD was giving Intel a hard time and there was finally hard competition.
adrian_b 7 hours ago
In terms of marketing, the "GHz" barrier was special, because surpassing it has indeed created a lot of recognition in the general public for the fact that the AMD Athlon CPUs were better than the Intel Pentium III CPUs.
In reality, of course, what you say is true, and the fact that the Athlon could provide a few extra hundred MHz of clock frequency was not decisive.
The Athlon had many improvements in microarchitecture in comparison with the Pentium III, which ensured much better performance even at equal clock frequency. For instance, the Athlon was the first x86 CPU able to do both a floating-point multiplication and a floating-point addition in a single clock cycle. The Pentium III, like all previous Intel Pentium CPUs, required 2 clock cycles for this pair of operations.
This much better floating-point performance of Athlon vs. Intel contrasted with the previous generation, where AMD K6 had competitive integer performance with Intel, but its floating-point performance was well below that of the various Intel Pentium models (which had hurt its performance in some games).
dlcarrier 6 hours ago
AMD being competitive at the time is what mattered, but there's still technological advancement needed for them to be competitive. In this case, it was AMD using copper interconnects that allowed them to not only hit 1 GHz, but quickly clock up from there: https://en.wikipedia.org/wiki/Athlon#Original_release
HarHarVeryFunny 7 hours ago
There was a time when increased clock speeds, or more generally increased processor throughput, were important. I can remember when computers were slow, even for things like browsing the web (and not just because internet connection speeds were slow), and paying more for a new, faster computer made sense. I think this time period lasted roughly until the "GHz era" or thereabouts, after which even the cheapest, slowest computers were all that anybody really needed, except for gamers, where the solution was a faster graphics card (which eventually led to GPU computing and the current AI revolution!).
1970-01-01 7 hours ago
You're conflating a few things here. The Vista era was the biggest requirement hit. That was the time when people really needed a faster PC to continue browsing. Before that, you could get away with XP running on a sub-GHz processor.