
au (Lv 4) asked in Science & Mathematics > Engineering · 7 years ago

Need help with selecting resistor values for a voltage divider?

I am trying to build a voltage divider with an input voltage of 15V and an output voltage of 12V. I want the power to be at least 1.6W. I have been trying different combinations of resistors and used online calculators, but nothing really gives me the required power. Can someone help me with this, please?
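For reference, the unloaded divider arithmetic for those targets can be sketched as below. This assumes the 1.6W figure means power dissipated in the divider chain itself, which the question doesn't actually specify:

```python
# Unloaded voltage divider: Vout = Vin * R2 / (R1 + R2).
# Assumption: the 1.6 W target is the power burned in the divider chain.
v_in, v_out, p_total = 15.0, 12.0, 1.6

r_total = v_in ** 2 / p_total    # P = Vin^2 / (R1 + R2)
r2 = r_total * (v_out / v_in)    # bottom resistor sets the tap voltage
r1 = r_total - r2

print(r_total, r2, r1)  # ~140.6, ~112.5, ~28.1 ohms
```

These values only hold with nothing connected to the tap; as the answers below point out, any real load changes the output voltage.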

Update:

Here's what I'm trying to do. I am building a mini-projector. I have an RGB LED lens box which I grabbed from a used projector (http://imgur.com/a/vmBJU). Each LED needs 12V to power up. I tried to power the red and green LEDs by providing 12V to each, but only the red LED lights up; the voltage drop across the red LED keeps 12V from reaching the green LED. So I wanted to use a voltage divider, so that I can start with a higher voltage of 15V and power both LEDs together. It might not be the correct approach, but I couldn't think of anything else.

Update 3:

Here's the correct link: http://imgur.com/a/vmBJU

2 Answers

  • Joe
    Lv 7
    7 years ago

    Designing a voltage divider to do that is trivial. But, as soon as you connect an external device, you're going to change the voltage. If your external device is a known, constant power draw, you can design for it in-circuit. But, if the load is going to change (e.g., your device contains a battery that will stop charging when it's full) your voltage will change from your intended 12 Volts.

    You cannot use a voltage divider in place of a regulated power supply. If that is not your intent, provide more details and I'll try to help you.
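    A quick sketch of the loading effect described above. The resistor values here are hypothetical (the pair that gives 12V unloaded from 15V at 1.6W); the point is how much the tap voltage sags once a load appears in parallel with the bottom resistor:

```python
# A load in parallel with R2 lowers the effective bottom resistance,
# so the divider's output voltage drops below the unloaded design value.
def divider_out(v_in, r1, r2, r_load=None):
    """Output voltage of an R1/R2 divider, optionally loaded by r_load."""
    bottom = r2 if r_load is None else (r2 * r_load) / (r2 + r_load)
    return v_in * bottom / (r1 + bottom)

print(divider_out(15.0, 28.125, 112.5))         # unloaded: 12.0 V
print(divider_out(15.0, 28.125, 112.5, 100.0))  # with a 100-ohm load: ~9.8 V
```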

    EDIT, 14-AUG-14:

    I saw your update; thank you. Yes, putting different LEDs in parallel like that won't work.

    Each LED needs an individual "current limiting resistor", calculated to pass the current you desire, given a voltage drop that's the difference between the supply voltage and the LED voltage drop. Then, you can connect the two resistor-LED series combinations in parallel. (You have the blue LED, too, right?)

    You'd normally get the LED voltage drops and current range off of the spec sheet, and use those values to calculate the current limiting resistor values. Since you don't have that, I'll suggest that you use trial-and-error with each LED, then connect them both in parallel.

    You could also run some tests with one or two "likely" resistor values (like 100 or 200 Ohms), take measurements and determine the voltage and current characteristics of the LEDs that way. Then, calculate the resistors using those parameters.

    If you're going to go with trial-and-error, I'll suggest that you stick with 12 Volts (which you know won't fry the LEDs) and maybe 100 Ohms for the red LED and 90 Ohms for green. Those are just my guesses, of course. If you don't get full illumination, I think that increasing the supply voltage would be better than reducing the resistor values.

  • 7 years ago

    To convert 15V to 12V at about 140mA, use a 2.7V, 400mW zener in series with the load. Connect it so that the zener is reverse biased: cathode toward the 15V supply.
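    A quick check of the numbers in this answer. With a series zener, the load sees the supply minus the zener voltage, and the zener dissipates its own drop times the load current, which is why the 400mW rating is cut close here:

```python
# Series zener drop: values taken from the answer above.
v_supply, v_zener, i_load = 15.0, 2.7, 0.140

v_load = v_supply - v_zener   # ~12.3 V reaches the load
p_zener = v_zener * i_load    # ~0.378 W, just under the 400 mW rating
print(v_load, p_zener)
```

Note the result is about 12.3V rather than exactly 12V, and this only holds while the load actually draws close to 140mA.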
