The history of using multiple GPUs in PCs can be traced back to the mid-2000s, when games began demanding more performance than a single graphics card could deliver. Many enthusiasts installed dual GPUs to boost performance, enabled by technologies like NVIDIA's SLI and AMD's CrossFire. But with today's single GPUs being sufficiently powerful, is it still necessary, or even practical, to install two GPUs in one computer?
NVIDIA ended SLI driver support in 2021, after keeping the technology on life support since 2016: starting with the GTX 10 series, SLI was limited to the GTX 1070, 1080, and 1080 Ti, and support was reduced from four cards to two. The GeForce RTX 3090 is the only RTX 30-series GPU that supports SLI (over NVLink), and SLI is not supported at all on the RTX 40 series.
As for CrossFire, the AMD RX Vega was the last AMD GPU to support it. AMD has since stopped shipping per-game CrossFire driver profiles, making multi-GPU setups with AMD cards impractical as well.
While some enthusiasts still run multi-GPU systems, they are generally impractical from the outset. SLI and CrossFire let two cards split rendering work, but in practice the scaling is poor: each game needs an explicit driver profile, and forums and discussions consistently describe a complex technology that trades efficiency, power draw, and frame pacing for marginal frame-rate gains.
Technically, it's still possible to install two or more GPUs in a PC, provided the motherboard has enough PCIe slots and the power supply can handle the load. However, software support is limited: the cards cannot pool their resources for a single game, and mixing GPUs from different manufacturers can introduce driver conflicts, even if the operating system detects both cards.
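The main exception is software written to address each card explicitly, compute frameworks in particular. The sketch below is only an illustration, assuming NVIDIA GPUs and an existing PyTorch installation; it lists every device the driver exposes and shows that memory is reported per card rather than pooled.

```python
# Minimal sketch: check which GPUs the driver exposes to software.
# Assumes NVIDIA GPUs and a working PyTorch install; this is illustrative,
# not a general-purpose detection tool for all vendors.
import torch

def list_gpus() -> None:
    count = torch.cuda.device_count()  # number of CUDA-capable GPUs visible to the driver
    if count == 0:
        print("No CUDA-capable GPU detected.")
        return
    for idx in range(count):
        props = torch.cuda.get_device_properties(idx)
        # Each GPU appears as a separate device; its memory is not pooled with the others.
        print(f"GPU {idx}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")

if __name__ == "__main__":
    list_gpus()
```

Even in this case, spreading one workload across both cards is up to the application, not the driver, which echoes the point above about games.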
Thus, while theoretically feasible, installing multiple GPUs in a PC today has little practical value in most scenarios. Beyond the lack of software support, combining GPUs rarely yields the expected performance boost, and compatibility issues can even hinder overall system stability.
Related:
- AI Chip Innovations for Data Centers & Cloud in the Future
- Data Center GPUs Have Short Lifespan: 1-3 Years
- GPU vs CPU: Which is Better for Neural Network Training?
- AI Chips Evolve: Can ASICs Challenge GPU Dominance?
Disclaimer:
- This channel does not make any representations or warranties regarding the availability, accuracy, timeliness, effectiveness, or completeness of any information posted, and hereby disclaims any liability for consequences arising from the use of this information.
- This channel is non-commercial and non-profit. Re-posting content does not signify endorsement of its views or confirmation of its authenticity, nor is it intended to constitute any other form of guidance. This channel bears no direct or indirect liability for inaccuracies or errors in re-posted or published information.
- Some data, materials, text, images, etc., used in this channel are sourced from the internet, and all reposts are duly credited to their sources. If you discover any work that infringes on your intellectual property rights or personal legal interests, please contact us, and we will promptly modify or remove it.