Laptop with 2 Graphics Chips?

Started by Hotshot, August 26, 2023, 09:43:01


Hotshot

I have a laptop that has 2 graphics chips: one is Intel UHD Graphics and the other is a GeForce 3050.

If I disable the GeForce 3050 and run a program with AGK Classic, it works fine.

If I disable the Intel UHD Graphics and run a program with AGK Classic, it doesn't work.

What I am trying to do is disable the Intel UHD Graphics in favour of the GeForce 3050, for more performance.

The question I would like to ask is: should I uninstall the Intel UHD Graphics, so that the GeForce 3050 becomes my default graphics chip? It should then run OK for AGK :)

But I have done research on the net, where someone said this:

"Let's clarify some terms. The Intel chip is your Internal Graphics Chip. The Nvidia is called the Discreet Graphics Chip.

Your laptop screen's image is always displayed through the internal graphics chip. When gaming or performing other heavy graphics jobs, the discreet graphics chip will do the heavy processing, and pass through the internal graphics chip to the display. This means the internal graphics chip is always on. It cannot be disabled. The discreet graphics chip is not always on. It can be disabled.

When you disable the discreet graphics chip, performance will suffer, but battery life will increase. "

So the question is: why can't I uninstall or disable the Intel UHD Graphics in favour of the GeForce 3050?
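
For reference, the usual per-application fix on Windows hybrid laptops is not to disable a chip at all: a program can ask the driver for the discrete GPU by exporting two well-known symbols. A minimal C++ sketch of the documented NVIDIA Optimus / AMD PowerXpress hints (the main() body is only a placeholder):

#include <windows.h>

// Exported globals that the Optimus/PowerXpress drivers look for in the
// .exe; a nonzero value requests the high-performance (discrete) GPU.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main() {
    // Placeholder: create the window and GL/D3D context here as usual;
    // the driver checks the exports above when the process starts.
    return 0;
}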

dawlane

#1
There are two places where you can enable/disable hardware:
1. On MS Windows, through the Device Manager.
2. Through the computer's BIOS, which you enter at boot.

You will need to read the laptop's manual or the laptop manufacturer's website for how to enter the BIOS. On EFI boot systems you usually have to hold down a certain key to bring up a boot menu and then select EFI boot.

Hotshot

#2
Thanks for the reply.

I did go to Device Manager to disable the Intel UHD Graphics in favour of the GeForce 3050, and when I used AGK to run a program I got quite a lot of errors due to the disabled Intel UHD Graphics. I want the GeForce 3050 to be the default because I would get more performance!

I have checked the BIOS and there is an "Enable Adaptive C-State for Discrete Graphics" option that is ON. Do I turn this OFF?

This is with the GeForce 3050 set as default, after I disabled the Intel UHD Graphics, and I get this (click the image to zoom):

dawlane

#3

Hotshot

Thanks dawlane


My laptop is a G15 511 with an Intel CPU, 16GB of RAM and a GeForce 3050 (if I had known that it had 2 graphics chips, I wouldn't have bought it).

TomToad

Most modern gaming and high-performance laptops use two graphics chips.  The main chip contains the display buffers, DAC, and other circuitry required to create a display.  The discrete graphics chip is a cheaper version, basically the same as the full version, but with all the display circuitry removed.  The mainboard circuitry and chipset are designed so the discrete chip can use the main chip's display logic without affecting performance.  This has the advantages of lower cost, less heat, and the ability to use only one display regardless of which chip is active.  It's similar to running two graphics cards in a PC with a cable in between: one card used for the display and maybe some light processing, the other used for accelerating 3D graphics.
            -----------
            | Monitor |
            |         |
            -----------
                 ^
                 |
-----------------------  -------------
| Intel GPU | buffers |  | NVidia GPU|
| backbuffer| DAC     |<-|backbuffer |
|accelerator|         |  |accelerator|
-----------------------  -------------
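
Both chips show up to Windows as separate adapters, which you can see for yourself. A minimal C++ sketch that lists them through DXGI (standard Windows API, nothing AGK-specific):

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    // Create a DXGI factory and walk every adapter (GPU) in the system.
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // On a hybrid laptop this prints both the Intel and the NVIDIA chip.
        wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                i, desc.Description, desc.DedicatedVideoMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}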


Generally, you should be using the drivers provided for your specific motherboard.  That way you'll know that whenever DirectX, OpenGL, or Vulkan is being used, the NVIDIA chip will automatically be selected.  You can probably get away with using the drivers on NVIDIA's site; they are usually good at properly detecting the chipset used.  To test, you could run a 3D benchmark program, then disable the NVIDIA chip and run the benchmark again to see if performance decreases.
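
Another quick check from inside a running program: with an OpenGL context current, the renderer strings name the driver actually servicing it. A minimal C++ sketch, assuming a context has already been created (generic OpenGL, not an AGK call):

#include <GL/gl.h>
#include <cstdio>
#pragma comment(lib, "opengl32.lib")

// Call while an OpenGL context is current; the strings identify the GPU
// doing the rendering (e.g. "NVIDIA ..." vs "Intel ...").
void PrintActiveGpu() {
    printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
}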

------------------------------------------------
8 rabbits equals 1 rabbyte.