Building my own computer: Monitor not receiving signal?

My monitor came with a VGA-to-VGA cable. My motherboard has both a VGA and a DVI port available.

My graphics card has only a DVI port.

I just turned on my computer for the first time; it lit up, and the fans on both the CPU and GPU ran fine.

After hooking my monitor up to the motherboard and pressing the power button (on both the monitor and the computer), the monitor failed to show anything from the computer.

It turned on and gave its purple "BenQ" start-up screen, but then went into power-saver mode as if there were no input. When I unplugged the VGA connection, it showed the "no connection" screen.

I'll continue to check that everything is connected properly, but any ideas what the issue could be?

(Also, do I still get the full benefit of the graphics card when I'm connected to the motherboard instead?)

Update:

Thanks guys! I was thinking that might be the case, but didn't want to buy an adapter if there might be something else wrong!

2 Answers

  • 5 years ago

    If you have a graphics card plugged in, you have to use it as the primary display output. If you plug the VGA cable into the motherboard, it will use the integrated graphics on the CPU/motherboard instead of the graphics card. If you want to do that, you need to remove the graphics card.

    The only options for using the graphics card are to buy a DVI-to-VGA adapter and use it on the card, or to buy a monitor that supports DVI.

  • 5 years ago

    Windows can use two video cards, but you have to choose the default one in the BIOS. On some computers, adding a video card disables the on-board video; these days, though, "on-board" usually means the GPU that's part of the CPU, so it no longer gets disabled automatically. You still have to nominate the default card, i.e. the INITIAL video card to use for default screen output. This setting is usually called INIT in the BIOS, so you would choose INIT: PCI-E as the first video card (your add-on card). Once you save the BIOS settings, Windows will use the add-on card; if you don't do this, it will never use your add-on card.
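
    A quick way to verify the change took effect, once you're back in Windows, is to list the video controllers the OS can see. This is just a minimal sketch (it assumes a Windows machine with Python 3 installed, and it shells out to the built-in wmic tool, which ships with Windows 7/8/10 but is deprecated on newer builds):

        import subprocess

        # Ask Windows, via the built-in wmic tool, for every video
        # controller it currently recognizes -- e.g. the CPU's integrated
        # GPU and the add-on PCI-E card.
        result = subprocess.run(
            ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout)

    If only one adapter shows up, the BIOS INIT setting (or the physical connection) is the first thing to re-check.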
