Intel A770, A750 review: We are this close to recommending these GPUs

New GPU series, ranging from $289 to $349, is less “amazing” and more “interesting.”

Sam Machkovech and Andrew Cunningham – Oct 5, 2022 1:00 pm UTC

[Photo caption: We took our handsome pair of new Arc A700-series GPUs out for some glamour shots. While minding standard static-related protocols, of course. Credit: Sam Machkovech]

What’s it like owning a brand-new Intel Arc A700-series graphics card? Is it the show-stopping clapback against Nvidia that wallet-pinched PC gamers have been dreaming of? Is it an absolute mess of unoptimized hardware and software? Does it play video games?

That last question is easy to answer: yes, and pretty well. Intel now has a series of GPUs entering the PC gaming market just in time for a few major industry trends to play out: some easing in the supply chain, some crashes in cryptocurrency markets, and more GPUs being sold near their originally announced MSRPs. If those factors continue to move in consumer-friendly directions, it will mean that people might actually get to buy and enjoy the best parts of Intel’s new A700-series graphics cards. (Sadly, limited stock remains a concern in modern GPU reviews. Without firm answers from Intel on how many units it’s making, we’re left wondering what kind of Arc GPU sell-outs to expect until further notice.)

While this is a fantastic first-generation stab at an established market, it’s still a first-generation stab. In great news, Intel is taking the GPU market seriously with how its Arc A770 (starting at $329) and Arc A750 (starting at $289) cards are architected. These cards perform best with modern and future rendering APIs, and in those gaming scenarios, their power and performance exceed their price points.

Yet our time with both Arc-branded GPUs has been like picking through a box of unlabeled chocolates. While none of our testing results were necessarily revolting, a significant percentage tasted funny enough to make a general recommendation pretty tricky.

Table of Contents

Warning: Intel buyers will want (if not need) a ReBAR-compatible PC
How we got here
More context: Major hires, plus an integrated reputation to shed
The (not-fully-baked) Arc Control app
Let’s do the numbers
At least one example of Intel’s Arc redemption
Hark! The Arc has some rough waters ahead
With Arc, it’s usually not a shame about ray (tracing)
The API alibi
More quirks, more codecs, more games
Verdict: Adventure awaits, if that’s what you want to pay for

Warning: Intel buyers will want (if not need) a ReBAR-compatible PC

[Photo gallery: A brief unboxing and examination of Intel’s new Arc A700-series GPUs. These boxes sure look like shirts you’d see in ’80s family photos. This shot was taken after one of the “Let’s Play” information packets had been removed. 1 x HDMI 2.1, 3 x DP 2.0. Twin fan blower, aimed downward when installed in a typical ATX case. Credit: Sam Machkovech]

There’s a lot to get into with Intel’s latest major entry into the GPU market, and it’s important to start by addressing a considerable barrier to entry for potential customers.

Intel strongly urges buyers of its new Arc graphics card line to triple-check their computer’s support for a pair of relatively recent features: Resizable BAR (“ReBAR”) and/or Smart Access Memory. We say “and/or” because they’re branded versions of the same technology. The shortest explanation is that a ReBAR-compatible motherboard can send much larger chunks of data to and from the graphics card on a regular basis, and Intel would really like you to turn the feature on if possible.
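As an aside for Linux users: one rough way to sanity-check whether ReBAR is active is to look at the GPU's BAR sizes in `lspci -vv` output, where a resized BAR shows up as a multi-gigabyte region rather than the legacy 256 MB window. The sketch below parses that output; the sample text is illustrative (the exact `lspci` formatting and the device details are assumptions, not captured from a real Arc card).

```python
import re

# Hypothetical, abbreviated lspci -vv excerpt for illustration only;
# real output varies by pciutils version and hardware.
SAMPLE_LSPCI = """\
03:00.0 VGA compatible controller: Intel Corporation DG2 [Arc A770]
        Capabilities: [420 v1] Physical Resizable BAR
                BAR 0: current size: 16GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB 16GB
"""

def rebar_status(lspci_text: str):
    """Return (capability_present, current BAR 0 size string or None)."""
    has_capability = "Resizable BAR" in lspci_text
    match = re.search(r"BAR 0: current size: (\S+),", lspci_text)
    return has_capability, (match.group(1) if match else None)

print(rebar_status(SAMPLE_LSPCI))  # (True, '16GB')
```

A tiny BAR 0 size (e.g. 256MB) despite the capability being present would suggest ReBAR is disabled in firmware.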

Will the Arc A750 and Arc A770 graphics cards work without Resizable BAR enabled? Yes, but we don’t recommend it. Intel’s Arc architecture leans heavily into ReBAR’s wide-open pipeline to your GPU’s frame buffer, so much so that it doesn’t have a fallback when a game’s workload includes constant streaming of assets like textures. The best example I found was in driving along Cyberpunk 2077’s sprawling highways at high speeds. With ReBAR enabled on my AMD Ryzen 7 5800X system, I could expect smooth-enough driving at 1440p with “high” settings enabled and ray tracing disabled. (This test’s “1 percent low” frame rate count, indicating the worst persistent dips, measured above 30 fps, which is pretty good.)
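For readers unfamiliar with the "1 percent low" metric: benchmark tools differ in their exact definitions, but one common formulation averages the slowest 1 percent of frames and converts that back to a frame rate. A minimal sketch of that approach:

```python
def one_percent_low_fps(frame_times_ms):
    """Frame rate averaged over the slowest 1% of frames.

    frame_times_ms: per-frame render times in milliseconds.
    (Benchmark tools vary in their exact definition; this is one
    common formulation, shown for illustration.)
    """
    worst_first = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst_first) // 100)      # slowest 1% of frames
    avg_ms = sum(worst_first[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~16.7 ms (~60 fps) plus one 50 ms hitch:
times = [16.7] * 99 + [50.0]
print(one_percent_low_fps(times))  # the single hitch dominates: 20.0 fps
```

This is why a single multi-second pause, like the ones described below with ReBAR disabled, craters the 1 percent low even when the average frame rate looks healthy.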

I then rebooted, disabled ReBAR on the BIOS level, and played the same Cyberpunk segment again. The result was nigh unplayable, thanks to constant multi-second pauses and chugs. To give this scenario a fair shake, I immediately reloaded the save file in question and tried again in case this was a matter of one-time shader compilation causing the stutters. The bad numbers persisted between the tests.

Should your favorite games revolve around tight corridors or slower runs through last-gen 3D environments, the Arc GPU difference between ReBAR enabled and disabled can range from a margin-of-error sliver to a 10 to 15 percent dip. But even if you can stomach those issues, you might run into significant quirks outside of gaming. In my case, Google Chrome and Microsoft Edge would both routinely glitch with ReBAR disabled while videos played in any tab. The whole browser window would turn into gibberish while the rest of the OS environment remained intact. It looked like this:

[Photo caption: When I left ReBAR disabled on my Intel Arc A700 series testing rig, my web browser would glitch out for 2 to 3 seconds at a time (in a way that I couldn’t capture using a Windows shortcut key combination) while the rest of the OS environment looked normal. Once I enabled ReBAR, this glitch never appeared again. Credit: Sam Machkovech]

The only fix for this error was to enable ReBAR. If you don’t have a relatively recent CPU-and-motherboard combo that supports either ReBAR or Smart Access Memory (basically, Intel’s 10th-gen CPUs and up or AMD’s Ryzen 3000 series and up), the rest of this review may be moot for you.

That’s an unfortunate brick wall for a PC gaming market dominated by budget-priced CPUs and GPUs.
