
Welcome to the Power Users community on Codidact!

Power Users is a Q&A site for questions about the usage of computer software and hardware. We are still a small site and would like to grow, so please consider joining our community. We are looking forward to your questions and answers; they are the building blocks of a repository of knowledge we are building together.

Understanding integrated/low-end GPU performance for budget upgrades

+1
−0

My current computer is about 10 years old. It suits my current needs well enough, but in the future I'd like to do a bit more, so I'm thinking about some budget upgrades to the CPU, motherboard, and memory. Generally I don't do the sort of "gaming" that would benefit, but I have an interest in video editing.

I considered the machine mid-range when I got it, although I went with a relatively inexpensive video card. This was on top of onboard graphics (I couldn't track down a suitable CPU without it).

Now that I have more experience, and a better idea of the options available to me, I'm debating between:

  • Getting a CPU (or APU) with onboard graphics, and possibly setting aside my old video card;

  • Getting a slightly better performing CPU without onboard graphics support, and keeping the card.

In general terms:

  • Can I expect my old graphics card to work with modern budget CPUs? What specific compatibility issues might I have to look out for, or how can I research this?

  • How would the power of integrated graphics on these CPUs likely compare to my old graphics card? I can easily find benchmarks for separate video cards and (by a different metric) the actual CPU performance of CPUs, but not for the video performance of integrated GPUs.

  • Notwithstanding that, if I get a CPU with integrated graphics, would keeping my existing video card plugged in likely make a noticeable difference in performance? If it wouldn't, I think I might prefer to save the power draw.


0 comment threads

3 answers


+2
−0

I'm a big believer in benchmark vs. cost, so I would suggest the following approach:

  1. Determine the "current power" of the hardware you currently have. Do some representative task on it and measure performance. Say you care about encoding video: take a sample video file and clock it at, say, 2 minutes to encode.
  2. Determine the "target power" that you want from new hardware. Decide how much better you want the task to go. Say you want to encode the sample file in 20 seconds, i.e. 6x faster.
  3. Go on https://www.videocardbenchmark.net/ and figure out the benchmark score of your current hardware. Then find cards that are accordingly more powerful. Continuing our video example, you want benchmark scores that are ~6x better.
    • Benchmark scores are not always linear, so a 2x benchmark score might not mean 2x the performance. I consider FPS in games to be a good linear baseline - usually, pushing 2x pixels/sec really does mean 2x performance, and it's not hard to find FPS benchmarks. You can then compare benchmark scores to FPS to figure out the (non)linearity of that particular benchmark.
  4. Of the cards you selected, pick one with a good benchmark-score-per-dollar value.
  5. Confirm that the card is compatible with your hardware using https://pcpartpicker.com/. PCIe is pretty standard now, but there are a lot of corner cases. When buying a discrete card, keep in mind that most need their own power connector, so you might also need to upgrade your PSU.
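The scaling arithmetic in steps 1-3 can be sketched as a tiny helper. The function name and all the numbers here are illustrative, and it assumes the benchmark score really is roughly linear in performance (which, as noted above, you should verify):

```python
def required_score(current_score: float,
                   current_seconds: float,
                   target_seconds: float) -> float:
    """Scale the current card's benchmark score by the desired speedup,
    assuming the score is roughly linear in real performance."""
    speedup = current_seconds / target_seconds
    return current_score * speedup

# Example from the steps above: a 2-minute encode with a 20-second
# target is a 6x speedup, so a card scoring 1000 needs a ~6000 score.
print(required_score(1000.0, 120.0, 20.0))  # -> 6000.0
```

If the benchmark turns out to be nonlinear, you'd fold the correction factor from the FPS comparison into `speedup` before scaling.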

The best value is usually the upper end of mid-range cards from ~1y ago.

  • High-end cards sacrifice efficiency to win the "fastest card" race and get headlines, so they end up overpriced (and as an early adopter, you pay, proportionately, the lion's share of the R&D).
  • Low-end cards sacrifice performance to win the "cheapest card" race; they don't try to compete on performance, because people who care much about performance would get a higher-end card.
  • When the newest generation comes out, the previous generation sometimes drops in price, because all the people with money to burn move on to the hot new stuff and vendors want to clear inventory. (That said, we're coming out of a multi-year supply chain disruption for GPUs, so they are very overpriced right now either way.)

But technology advances rapidly, so good mid- and low-end new cards will generally be better than high-end old cards. A discrete card will tend to be better than onboard or APU graphics, but that will fall out of the benchmark scores anyway. Buying older cards (more than 2-3 years old) is usually impractical, because scalpers drive the price up. But again, this will fall out of the performance-per-dollar statistic.
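The performance-per-dollar comparison itself is a one-liner. All card names, scores, and prices below are invented for illustration; plug in real figures from the benchmark site:

```python
# Hypothetical candidates: (name, benchmark score, price in $).
cards = [
    {"name": "last-gen upper mid-range", "score": 18000, "price": 350},
    {"name": "current-gen flagship",     "score": 30000, "price": 1200},
    {"name": "current-gen low end",      "score": 9000,  "price": 220},
]

# Rank by benchmark score per dollar, best value first.
ranked = sorted(cards, key=lambda c: c["score"] / c["price"], reverse=True)
for c in ranked:
    print(f'{c["name"]}: {c["score"] / c["price"]:.1f} points per $')
```

With these made-up numbers the last-gen upper mid-range card comes out on top, matching the rule of thumb above.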


0 comment threads

+2
−0

You're in a bit of a period of transition - and it's a little more complicated than one would assume. A decade is a very long time, and both 'big' CPU makers are transitioning towards much more capable integrated graphics architectures - Intel Arc is much more capable than older Intel onboard graphics, and AMD's is good enough. If you're doing transcoding, Intel also has Quick Sync on board, which is handy. I think both could run circles around, say, an Nvidia 6xx-series card or its AMD equivalent.

Can I expect my old graphics card to work with modern budget CPUs? What specific compatibility issues might I have to look out for, or how can I research this?

The hardware should still be compatible. There's a distinct chance that an onboard graphics solution is better. However, a lot of older cards from around that era likely no longer get driver support on a modern platform. If they're not EOL already, they will be soon.

How would the power of integrated graphics on these CPUs likely compare to my old graphics card? I can easily find benchmarks for separate video cards and (by a different metric) the actual CPU performance of CPUs, but not for the video performance of integrated GPUs.

Possibly on par or better at the bottommost end. I'm pretty sure the heavily cut-down GPU on my 3rd-gen (we're at 9 now?) laptop-grade AMD Ryzen is good enough for very, very low-end gaming.

Notwithstanding that, if I get a CPU with integrated graphics, would keeping my existing video card plugged in likely make a noticeable difference in performance? If it wouldn't, I think I might prefer to save the power draw.

For a card that was medium/low end 10 years ago, I doubt keeping it is going to be a good option. Might as well get a 'better' new system and run that.


0 comment threads

+1
−0

Can I expect my old graphics card to work with modern budget CPUs? What specific compatibility issues might I have to look out for, or how can I research this?

Yes, it will very likely work, particularly if it's from one of the big two: Nvidia or AMD (usually branded GeForce or Radeon plus a model number). The only issue I can think of is that drivers might no longer be updated, but that's mostly a concern for gaming.

It's trickier to buy a new graphics card for an old PC, since newer cards might consume more power, and you might have to upgrade the PSU as well at that point. I used to do this for gaming purposes, but it was such a chore resolving all the problems that came with it that it simply wasn't worth it, and I might as well have upgraded to a new PC. Replacing individual boards is only worth it if you actually enjoy fiddling around with PC building and troubleshooting.
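The PSU concern can be made concrete with a rough headroom check. The wattage figures and the 80% loading margin below are illustrative rules of thumb, not a sizing specification - check the card vendor's stated PSU requirement for any real build:

```python
def psu_has_headroom(psu_watts: float, cpu_watts: float, gpu_watts: float,
                     other_watts: float = 100.0, margin: float = 0.8) -> bool:
    """Return True if the estimated system draw stays within `margin`
    of the PSU's rated capacity (a common rule-of-thumb safety factor)."""
    total_draw = cpu_watts + gpu_watts + other_watts
    return total_draw <= psu_watts * margin

# e.g. a 450 W PSU with a 95 W CPU and a 250 W card: ~445 W of draw
# against 360 W of usable capacity -> upgrade the PSU first.
print(psu_has_headroom(450, 95, 250))  # -> False
```

A failing check is exactly the "new card forces a PSU upgrade too" situation described above.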

How would the power of integrated graphics on these CPUs likely compare to my old graphics card? I can easily find benchmarks for separate video cards and (by a different metric) the actual CPU performance of CPUs, but not for the video performance of integrated GPUs.

Assuming you mean performance, it is pretty safe to assume integrated graphics will be subpar at best. That's not really something you should go for unless you only plan to use the computer for browsing the web or MS Office-type programs (or programming :) ).

Benchmarking and graphics card performance have traditionally been a bunch of non-scientific voodoo. Frames per second is quite a subjective metric. Certain programs use certain aspects of the graphics pipeline more extensively, so performance is all about fine-tuning the graphics settings for your particular program. Better cards give you more room to maneuver, so to speak.

When the newest generation comes out, sometimes the previous generation drops in price because all the people with money to burn are now moving on the hot new stuff and they want to clear inventory (that said, we're coming out of a multi-year supply chain disruption for GPUs, so they are very overpriced right now either way)

Yep. I've been doing PC gaming since before dedicated graphics cards became a thing, and the only thing that's been consistent with such cards is the incredible hype. Every new generation is ridiculously hyped, but the actual improvements are usually marginal. They do get better over time though, especially over a 10-year period.

Now, when Nvidia or AMD releases a new generation, there's always a ridiculously overhyped and overpriced flagship product, which one should simply dismiss. Then there's the 2nd- or 3rd-best one, which will be overpriced too and would only benefit hardcore gamers who play a lot of graphics-intensive games. General gamers who just want a really nice card should go for one in the mid-range, which is only marginally worse than its bigger brothers.

You could also consider buying the previous generation, which will be cheaper but with a couple of years less lifetime.

Notwithstanding that, if I get a CPU with integrated graphics, would keeping my existing video card plugged in likely make a noticeable difference in performance? If it wouldn't, I think I might prefer to save the power draw.

Really hard to tell. Graphics cards haven't improved that much over 10 years, so a 10-year-old gamer card might possibly outperform a modern integrated one, but I wouldn't count on it.


The general recommendation is that you should probably consider buying an entirely new PC. With a 10-year-old PC you might not have an SSD(?), or if you do, it may be an early model with poor data retention.

Apart from that, electronics do age. If you take good care of the internals and carefully vacuum out the dust while minding ESD, it will probably last long. If you don't, it's going to "clog up" with dust eventually, and components start to fail - from shorts once it fills up with dust, or from plain aging. Electrolytic capacitors in your PSU and motherboard are particularly prone to failure after spending 10+ years in a hot environment.

It's all about what budget you have from there.


1 comment thread

Trying to clarify scope and intent (1 comment)
