Post History
Answer
#1: Initial revision
> Can I expect my old graphics card to work with modern budget CPUs? What specific compatibility issues might I have to look out for, or how can I research this?

Yeah, sure, it will very likely work, particularly if it's from one of the big two: Nvidia or AMD (usually named GeForce or Radeon plus a bunch of numbers). The only issue I can think of is that drivers might no longer be updated, but that's mostly a concern when gaming. It's trickier to buy a new graphics card for an old PC, since newer cards might consume more power and you might have to upgrade the PSU as well at that point (see the rough power-budget sketch at the end of this answer). I used to do this for gaming purposes, but it was such a chore resolving all the problems that came with it that it simply wasn't worth it; I might as well have upgraded to a new PC. Replacing individual boards is only worth it if you actually enjoy fiddling around with PC building and troubleshooting.

> How would the power of integrated graphics on these CPUs likely compare to my old graphics card? I can easily find benchmarks for separate video cards and (by a different metric) the actual CPU performance of CPUs, but not for the video performance of integrated GPUs.

Assuming you mean performance, it is pretty safe to assume it is subpar at best. That's not really something you should go for unless you only plan to use the computer for browsing the web or using MS Office type programs (or doing programming :) ). Benchmarking and graphics card performance have traditionally been a bunch of non-scientific voodoo. Frames per second is quite a subjective metric. Certain programs use certain aspects of graphics more extensively, so performance is all about fine-tuning the graphics settings for your particular program. Better cards give you more room to maneuver, so to speak.

> When the newest generation comes out, sometimes the previous generation drops in price because all the people with money to burn are now moving on the hot new stuff and they want to clear inventory (that said, we're coming out of a multi-year supply chain disruption for GPUs, so they are very overpriced right now either way)

Yep. I've been doing PC gaming since before external graphics cards became a thing, and the only thing that's been consistent with such cards is the incredible hype. Every new generation is ridiculously hyped, but the actual improvements are usually marginal. They do get better over time though, especially over a 10-year period. When Nvidia or AMD releases a new generation, there's always a ridiculously overhyped and overpriced flagship product which one should simply dismiss. Then there's the second- or third-best one, which will be overpriced too and only benefits hardcore gamers who play a lot of graphics-intensive games. General gamers who just want a really nice card would go for one in the mid-range, which is only marginally worse than its bigger brothers. You could also consider buying the previous generation, which will be cheaper but with a couple of years less lifetime ahead of it.

> Notwithstanding that, if I get a CPU with integrated graphics, would keeping my existing video card plugged in likely make a noticeable difference in performance? If it wouldn't, I think I might prefer to save the power draw.

Really hard to tell. Graphics cards haven't improved *that* much over 10 years, so a 10-year-old gaming card might possibly outperform a modern integrated one, but I wouldn't count on it.

---

The general recommendation is that you should probably consider buying an entirely new PC.
With a 10-year-old PC you might not have an SSD at all, or if you do, it might be an early, somewhat experimental one with poor data retention. Apart from that, electronics do age. If you take good care of the internals and vacuum them carefully, minding ESD, it will probably last a long time. If you don't, well, it's going to "clog up" with dust eventually: components start to fail due to shorts once it fills up with dust, or simply due to plain aging. Electrolytic caps in your PSU and motherboard are particularly prone to failure after spending 10+ years in a hot environment. From there, it's all about what budget you have.
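
To make the PSU point above concrete, here is a minimal back-of-the-envelope sketch in Python. Every wattage in it is a placeholder I made up for illustration; look up the real TDP/TGP figures for your own CPU, GPU and drives, and the rating printed on your PSU, before drawing any conclusions.

```python
# Rough sketch of the PSU headroom check mentioned above.
# All wattages below are illustrative assumptions, not real measurements.

def psu_headroom(psu_watts, component_watts, margin=0.8):
    """Return the wattage left over after the listed components, assuming
    you only want to load the PSU to `margin` (80%) of its rating."""
    usable = psu_watts * margin
    return usable - sum(component_watts.values())

# Hypothetical 10-year-old build with a 450 W PSU and a newer mid-range card
build = {
    "cpu": 95,              # typical desktop CPU TDP
    "gpu": 220,             # a newer mid-range card can draw this much or more
    "drives_fans": 30,
    "motherboard_ram": 50,
}

leftover = psu_headroom(450, build)
print(f"Headroom: {leftover:.0f} W")  # negative means the PSU is likely too small
```

If the headroom comes out negative (or barely positive), that's the point where "just swap the graphics card" turns into "also buy a new PSU", which is exactly the kind of chore I was complaining about.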