
New CPU

Started by
25 comments, last by Glandalf 24 years, 5 months ago
The main reason Intel's new chip will not be compatible is not that they didn't want it to be, but that the current state of the P6 core is a mess. You can think of every stage, from 8086 to 80286, to 80386, all the way to the current P3, as an add-on to the existing package. While this is not exactly correct, it helps to visualize what they've done. They maintained backwards compatibility by, to use an OOP term, employing inheritance. The 80486 inherits the behaviour of the 80386 and adds new functionality on top of that. This resulted in an architecture, and an assembly language syntax, that is spaghetti-like in its complexity and ultimately impossible to update. The 64-bit RISC design is a clean slate, a return to simplicity. We shall see...
------When thirsty for life, drink whisky. When thirsty for water, add ice.
you guys are getting way off the question that gandalf posted. just a few things to say: you can upgrade your 133 to something like a 300 for the time being, for about 50 bucks, or another reasonable processor around that cost. also, the new intel chip, i hear, is not based on the x86 architecture; it's a whole new architecture.
wow.. a lot of people making things up as they go, eh?

first i'll start with the 64-bit intel chip. it emulates all of the 32-bit and 16-bit x86 code in hardware. this means everything will run on it, but with a performance hit. since it's emulated in hardware and not through software, the performance hit won't be very big. also, it's not RISC or CISC; it's EPIC. one more thing: it'll cost too much for home users till around 2002 or 2003, unless you want to pay more than $15k for a home pc. it's meant for high-end servers and workstations at the start. intel has a new 32-bit architecture coming out this year which is supposed to have 1MB full-speed cache and is based on their 64-bit architecture, but is 32-bit and runs the same x86 instructions.

now to try to answer the original post... if you're on a budget, go with a single celeron 466, 500, or 533 system. it's much, much faster than your p133 and it won't break the bank. otherwise you'll have to buy something much more expensive. if you want the fastest (expect BIG $$$ at first, like all new CPUs), wait a couple months, either for Athlons with on-die full-speed cache and 266MHz FSB, AGP 4x, and DDR-SDRAM support, or for the next-generation Intel 32-bit chip, which just might haul ass in today's apps. it's not based on the P6 arch; it's the design between the p3 coppermine and the itanium. i believe it's called willamette or something like that.

later.....
If you wait, you wait forever. Set a date and aim for it.
The current Celeron is almost dead. When I say that, I mean Celeron in its 66MHz bus incarnation. The 100MHz Celeron is nigh.
The current P3s are beginning to fade. Bear witness to the evolution to the Coppermine. This will make Intel match the Athlon...
...Until the AMD changes the Athlon process and the CPU war rages on.
Let me just say that the Itanium will be FAR out of your monetary reach. It would be like buying a Xeon. Look at how much a Xeon costs.

1GHz looks feasible by the middle of the year. Easy. Intel roadmaps 1GHz by the end of the year, but with AMD nipping at their heels? Or they could both make a silent agreement to milk whatever value out of their processors and the consumer... but I think AMD wants to do some fragging. 1GHz is nigh I tell you!

As for the motherboard: well, a fantastic CPU is nothing without a good motherboard. One can only judge the BX boards, as the Athlon boards are not for the weak of stomach. Unfortunately, native Coppermine motherboards (as opposed to BX boards with a converter to take the Coppermine) are meant to be used with RAMBUS, which costs a heap. Sure, they're backward compatible with SDRAM, but there is a performance hit.

Also, it has been shown that, as of *now*, UDMA66 and AGP 4x have barely noticeable performance advantages. But then you lose out on bragging rights.

Either way, you have a *bit* of a wait. At least until July, if AMD don't take some potshots at Intel...

If you can hold out, sure, wait. If not, buy now, as cheaply as you can while still handling whatever you throw at it (be it Celeron/Pentium or Athlon), and save like you were going for that Itanium. Then when the Itanium comes out, you should have enough to buy a whole system.

Of course, I could be completely wrong.
Jeranon
Game maker wannabe.
Intel is doing this partly because Sony is going to release a 128-bit architecture and chip that will dwarf the capabilities of any 32-bit chip. Granted, the Playstation 2 is a console machine, but it is not a far stretch to think Sony could adapt the architecture for use in the PC world. In fact, I believe I have heard rumors about this happening on the gamasutra website. Anyway, in order to compete with AMD, Sony, and the other chip manufacturers, we all knew there would come a time when Intel would shed its excess instruction-set baggage. That time is now and I am excited. We finally get to leave the x86 architecture behind and move on to something that is significantly better.

Kressilac


quote: Original post by ZomeonE

I'm not in the position to have 2 PCs.
I can't understand why Intel would make a CPU which isn't backwards compatible with all software today.
I'm not sure I will get it if I can't run the software I have today. I'll just hope that Intel will do something about that, so that people like me can run the old software that they've paid a lot of money for.


Derek Licciardi (Kressilac)
Elysian Productions Inc.
it's too bad amd isn't doing the same with their new 64-bit chip, which is still based around the x86 architecture
Yeah, yet AMD figured that keeping backwards compatibility with the huge number of applications built for x86 was important. AMD has shown that it can design processors superior to Intel's. I had not heard of Intel supporting the old x86 architecture on their Itanium natively. But you can bet that AMD will do it, and do it fast... and well.

There is no reason to cast off an entire instruction set, and AMD will prove that.

---
Michael Tanczos

