I think this is the best solution. Pay the $100 this month in good faith and agree this is how you will do it moving forward. Would only work if Home is where you exclusively charge though.
Or get a kill a watt... P3 P4400 Kill A Watt Electricity Usage Monitor [link]
In the house I lived in growing up, we had an old refrigerator that, once shut off, dropped our electric bill by $100.
My only suggestion is to do one room at a time and see how that affects things. Or you can buy a Wattage Usage Monitor from Amazon and plug that into things to see how much they are using.
Look into a Kill A Watt. It will read voltage and other measurements on items that plug into it. They are $20 on Amazon. I keep one plugged in to a convenient outlet in our 5th wheel so I always know the AC voltage.
If you charge with the 110v charger that comes with the car, you can plug it into a <strong>Kill-a-watt</strong> meter for $20. That'll show you actual consumption. It'll even let you enter your electricity price and then it'll tell you how much money you're putting in the car for fuel.
Get one of these and find out how much electricity your appliances are using. I recently replaced my old refrigerator with a new one and saved about $10 a month. Your bill is high, but not so high that it couldn't be accounted for if you have old appliances or energy-hogging light bulbs.
You can get a Kill-A-Watt meter for $20 which can measure voltage, current, and wattage for 120v and has a hold feature. As far as convenience goes it seems like a better deal.
One of these will help to figure out your power usage. It's very interesting to see how much power your appliances use over time.
Well, my 8320 rig pulls about 180W with those settings listed at full load (which I leave it on for BOINC). At stock, it was closer to 250W. I was worried about power usage when I first started, but honestly I don't think it's that much. If you are worried, get a Kill-a-Watt for it; that's how I measure mine.
For comparison, I believe it's roughly equivalent to leaving two 100W lightbulbs running year round, and plenty of people seem to have few qualms about that. My parents heat cold rooms in their house with 1000W space heaters and never worry about energy usage.
Also, in the winter, the heat output doubles as a space heater. I use mine to heat my bedroom; it's freezing.
Buy this ([link]) and test your outlets. You could also just use a voltmeter and make sure your voltages are in the 100-120 VAC range. Isolate the problem by room/circuit. That should give you a starting point for determining the root cause.
This probably isn't the type of software solution you're looking for; however, in my experience it is sometimes difficult to determine what's going on at the hardware level based on the output that OS X gives you. If it were me, I'd look at the values from my Kill A Watt meter, and if they looked abnormal, I'd compare them to what they were in another OS.
I also seem to recall doing a command somewhat like the one below whenever I did OSX on my Lenovo X61 tablet, which was by far the most difficult OSX project that I've ever done.
lshw | grep -i cstates
Your what is what? My PC runs 24/7, too. It draws 23W at idle, which is most of the time. It draws over 400W under heavy load, such as when I'm gaming. I have a 650W power supply. (measured using a kill a watt)
The big issue I have with your original comment is "that won't even power my house" is a nonsensical statement. I can run your entire house from a car battery or two and a few power inverters, but probably not for more than a few seconds. Electrical load (power) is measured in watts, but energy consumption and battery storage are measured in power*time, or watt-hours. So I asked "are you looking at the labels or your energy consumption?" and you tell me what the weather is like...
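A quick sketch of that power-versus-energy point, with made-up but plausible numbers (the battery capacity and house load below are assumptions for illustration, not measurements):

```python
# Power (watts) is a rate; energy (watt-hours) is an amount.
# A car battery can briefly supply a house-sized load, but it stores
# very little energy. Assumed: a 12 V, 60 Ah battery and a 2 kW
# average whole-house load.

battery_wh = 12 * 60        # ~720 Wh of stored energy
house_load_w = 2000         # assumed average whole-house draw, watts

runtime_minutes = battery_wh / house_load_w * 60
print(f"~{runtime_minutes:.0f} minutes")  # ~22 minutes, in the ideal case
```

In practice inverter losses and appliance surge currents cut that way down, which is the commenter's point: load (watts) and consumption (watt-hours) are different questions.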
You want one of these:
Electric heat is the obvious culprit, but an old fridge can use a staggering amount of electricity.
For <$20 you can get a Kill a Watt and measure every 110V device in your house to see what devices are power vampires in both the on and off modes. Also, I would highly recommend plugging all electronic devices into a surge strip and simply turning off the surge strip when not in use.
At an extremely rough estimate, I would say your CPU's idle power consumption will be between 100 and 120 watts.
The only accurate way of knowing is to actually measure it with a device like this one: [link]
If you're all electric (power and heating/cooling), then you should be able to get to the source of the problem pretty easily. I'd start at the meter and take daily readings.
This is also a really useful tool for drilling down on what's using the most power: killaWATT
I have a 2000 sqft 4BR colonial that's reasonably well insulated.... I've burned about 1.5 cords of wood, but my last three BGE bills have been far lower: $180/$240/$215.
As others have pointed out, the rating on the power supply for things is usually an "absolute maximum," not a reflection of how much the device actually draws. These little guys are a pretty cheap way to get the real numbers, though, and it can occasionally be surprising.
If anything, I think your CPU will be the bottleneck. That being said, you could probably just get a $200-range card and be fine. You might not be able to max the graphics, but you'll be able to tweak things and get it to where you like. I own a Kill A Watt, and I found out that my PC pulls no more than 350 watts. That's with an overclocked i5 CPU and a GTX 770 running full blast. I kind of went overkill with the PSU, but that's not a bad thing. I have had too many bad experiences with cheap PSUs to ever use one again. Even some PC subreddits encourage people not to recommend cheap PSUs. I guess what I am trying to say is you may be fine using the PSU you already have.
You can get an app that keeps track of electricity. Some can read the meter automatically, depending on the type of meter; with others you just enter the reading daily. (Mine can't be auto-checked, so I enter the number, then look at my bill. I get so many kWh at one price; go over that amount and the rest are charged at a higher rate. Really, just track 1 or 2 days, multiply by 30 to get your total monthly kWh, then look at how they charge you on the bill, like the first 1500 kWh at one rate, etc., and do the math.)
You can then try to find what is using up all your electricity. It also makes it easier to see if you are being charged correctly.
Getting a Kill-A-Watt meter can help you find what is using the most electricity. Things like turning down the water heater can help a ton.
Also, warm yourself more than your house. Yes, wear more clothes inside. It helps to have an electric blanket: turn down the heat at night and just heat your bed.
Close off vents for unused rooms, seal off windows, etc.
Lastly, someone could be stealing your electricity.
The GTX970 doesn't require a lot of power, so 850w sounds like overkill. Too little power will more likely give stability issues, rather than throttle your FPS.
It's relatively cheap and easy to see how much power your rig is pulling, with one of these:
Your local hardware store may even have them in stock.
I have a PNY GTX 770 with three fans. It hardly ever gets loud or hot. I feel like I got lucky buying it (paid $329 for it about a year ago I think.) That VRM cooling does the trick. I used a Kill A Watt out of curiosity and the PC barely pulls anything while idle. It pulls about 300 watts while max gaming. I thought it would pull a lot more so I kind of went overkill with the Corsair Enthusiast Series 650-Watt 80 Plus.
edit: When I said 'idle', I meant in sleep mode.
> I wonder if there might be an app or something (kind of like Nest) that could help people with power budgeting--appliances
You're looking for something like the Kill A Watt.
>a decent PC, and 24 inch TV might use about 500W
That's not true. That might be the max rated wattage of the PC, but it doesn't consume that in a steady state regularly. I just built a PC with an i5-4660 and a GTX 960, plugged it into a Kill A Watt, and it was drawing about 50 watts, and a lot of PCs today (like Apple all-in-ones, laptops, and tablets) draw less power than that with mobile parts.
It might contain a "500W" power supply but that's far from drawing that continually on average.
something like this: [link]
plug it into the wall, then your power cable into it, observe and document your usage. If you just want an estimate, go over to pcpartpicker.com and fill in all of your components.
> Apparently my desktop computer, laptop, and three LCD monitors is sometimes more than 15A.
I'd replace the breaker...it seems really unlikely that load adds up to 15 amps.
Or more specifically, I'd plug in my Kill-a-watt meter and find out for sure.
Just because the PSU is rated at 700W doesn't mean it is constantly using that. You could check with something like this Kill A Watt, but unless you're going balls to the wall playing some serious games, encoding, or doing other CPU/GPU-intensive tasks, your computer is probably in the 50W range.
Honestly, you need to learn to measure electricity. You might be getting misbilled, or scumbags might be stealing from you.
Here's an example of one way to measure usage (but honestly, you should talk to electrical experts).
CoreTemp shows your cpu wattage, and other software like hwinfo64 shows some power stuff (not sure exactly what, don't remember and I'm on mobile atm). However those aren't always accurate afaik. Only way to know for sure is to buy something like this.
I am just saying that I have a CPU that takes 9 more watts, a GPU that takes at least 100 more watts not OC'd (it is), and other stuff, on a PSU rated 100 watts lower and of worse quality, and I run Witcher 3 or Far Cry 4 with the GPU at 100 percent with no issues. I used to run benchmarks and stress tests, but they pushed the hardware more than any game ever will. Good luck to you. Also, the 970 takes about 5 more watts than your GPU.
You could pick up one of these and see how much you are pulling.
Invest in a "kill-a-watt" monitor, nothing else is going to give you an accurate idea of how much is actually being drawn. They're like $20 on Amazon so it's not like it's going to break the bank. Ignore the "calculators", they're just taking a guess, real readings are actually useful.
If you go too far, what happens depends on how far. Typically, it's just a lot of heat, and whatever is going to be blown up by that heat. It might just shut down, but generally, everything gets really hot (exceeding the PSU's wattage causes a voltage drop in the rails, which means the components demand more amperage to create the same wattage, and amperage=current=heat.)
It's totally possible to fry it all. Unlikely, but possible.
If you're concerned, going up to a 1000w isn't much more expensive considering your total build. ($50 extra)
If you're feeling the strain, just pick up a power meter to see how much power you're drawing at load.
If you ever get really curious, these are really handy to have around. Can't say they pay for themselves or anything, but I really like mine. It lets you put into numbers just how much energy things around the house actually use.
Use the styrofoam if it is pink colored. That usually means it is anti-static and will protect the GPU. Otherwise DON'T use styrofoam, since that stuff is an ESD nightmare.
If you have a big or tall cpu cooler you might consider taking that off.
Or if you anticipate a very smooth drive mostly on the highway I think you would be ok not removing anything from the case. Just make sure the PC is lying in a way so that the motherboard is flat and the graphics card and CPU cooler are pointing upwards.
Wall outlet power meter
It is a bit close.
PC Part Picker puts you at 543 watts, although it is not the most accurate.
Are you planning to overclock the graphics card? Or just the CPU?
Since you have 46 amps on the +12V rail with the CX600, that gives you 552 watts to work with for the majority of your power consumption (CPU and GPU will use +12 volt power).
If you are not overclocking the GPU then I would not be particularly concerned. The R9 290x is going to be the biggest consumer of power, and at stock it shouldn't consume over 300 watts in gaming loads, which leaves plenty of extra power for the CPU and other components.
Also I definitely would not run Furmark or Combustor or other GPU stress tests if you do stick with the CX600, since they can drive power consumption significantly above what you can see in gaming.
If you really want to know you can buy this for ~$18:
With that you can read how much AC power your PSU is drawing from the wall and you can figure out if you are coming close to its capacity. I would suspect with the CPU only overclocked, you would be well away from hitting ~550 watts DC power consumption during gaming (that translates to around 650 watts from the wall due to efficiency loss).
You could use something like this
Just keep in mind that the PSU will have an inefficiency, so that if you're pulling 400W from the wall with an 80% efficiency you'd actually be using 320W (400*0.8) of your 450W max.
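To make that arithmetic concrete, here is the same example in a few lines (numbers match the comment above: 400 W at the wall, 80% efficiency, 450 W rated PSU):

```python
# PSU efficiency: the wall always supplies more than the components get.
wall_draw_w = 400      # measured at the wall with a watt meter
efficiency = 0.80      # an 80%-efficient PSU
psu_rating_w = 450     # the PSU's rated DC output capacity

dc_load_w = wall_draw_w * efficiency    # power the components actually use
headroom_w = psu_rating_w - dc_load_w   # rated capacity still unused
print(dc_load_w, headroom_w)  # 320.0 130.0
```

So a 400 W wall reading on a 450 W unit still leaves real headroom, because the PSU's rating refers to DC output, not wall draw.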
Yep, you plug it into the wall, then plug your appliance/device into it, and it will give you a measurement of the power (in watts) that it is using.
For example, I had a DVR that was always plugged in and in stand-by mode while we weren't watching TV. It was using something like 8 watts, meaning it was wasting energy even when we weren't using it. All of these items around the house add up.
Another thing you can take a look at is the temperature of the fridge and freezer. You can probably raise the temperature setting a bit and things will not spoil. That was a huge item at our house.
Here you go: [link]
This could be a concern, because the i5 at 4.0 is probably around the 135-watt area under load as well. That means a total draw of 475-ish watts, which isn't leaving a whole lot of room. Could just be a bunk PSU in this case, OP /u/aero_enginerd. I don't know of any way to test the wattage draw besides the methods you have mentioned, short of buying something like a Kill A Watt.
Triangle is indeed positive, it shouldn't matter much because a switch doesn't get affected by the direction of current.
That said, I would start by checking that all the cables, graphics card, and RAM are plugged in all the way. According to your motherboard manual, the motherboard has "EZ Debug LEDs" (pg 40 in the manual); if one of those LEDs lights up, that's the problem. If none of them light up, I would check that the power supply is working, either by using a spare power supply to see if your build turns on, or by using a monitoring plug like the Kill A Watt to see if the power supply is drawing power from the wall. If none of that works, I would remove the graphics card and use the integrated graphics to see if it's a dead card. If you have some spare sticks of RAM, I would try them to check for dead RAM; also try the RAM in the other two slots.
If it's none of that, I would assume it's a dead motherboard or a dead CPU. However, your description sounds like a dead power supply, so test that first.
Unlikely. Have you measured the power draw of your machine under full load, and then corrected for PSU efficiency? You'd need something like this (for some reason, I'm seeing the wrong product picture).
The original post indicated 5-10 players, which is not the same as a turn-it-on-when-you-need-it solution like yours. Electricity costs do wipe out any cost savings in a 24/7 operation, unless someone else is paying for your electricity. Get a device like a Kill-a-watt: [link]
and you will become much more aware of what the true cost of electricity is for different devices. High-end computers are really pretty power-hungry as electronic devices go, not that this will stop me from getting GPUs that require 600-watt+ power supplies. Servers aren't so bad, because they typically lack a GPU and have no need of a monitor. But I'd imagine for ARK it would be running at high utilization instead of idling, and therefore consuming more power.
I can't find my Kill-A-Watt right now, but some back-of-the-napkin math would suggest it should be about 70 watts total consumption. At 12 hours per day it should average about 25kWh per month, which, around here (WA), would be about $2.80 per month cost.
The best thing to do is find out exactly how much power it uses by using a watt meter ([link]). It is certainly doable. You'd need to find out power consumption, then determine how long you intend to play each day, and do the math to determine its daily power usage. From there you can design a battery bank that will suit your needs.
An 850W PSU should be more than enough power, especially considering the 900 series are notably more power efficient than previous cards. Get yourself a Kill-a-watt, plug your rig into it, crank up the benchmark software, and see what you peak out at. I can almost promise that for a single GPU it'll never get too far above 400W, which is well within the efficiency curve of modern PSUs.
If your PSU is from an OEM build, it's probably built with cheap components that can't provide the rated wattage. If you want to upgrade your CPU/GPU, I suggest dropping the $30-$50 for a nicer PSU to make sure you get the power you need without frying any components.
Also, if you're interested it actual power draw, you can pick up one of these.
The AVR is going to use some power. You can buy a Kill A Watt Electricity Usage Monitor to determine how much electricity it uses. (that is a US model, you will need one suitable for your location)
Keep in mind that if you are heating your house, the waste heat is helping offset some heating costs.
Watt meters are fairly cheap; here's one similar to mine. You just plug your computer into the meter and plug the meter into the wall. It can also help pinpoint high-use devices; my portable AC is a monster :(
I like having an individual one for each system; that way I can monitor the power draw and also give them appropriate backup time depending on my needs. If you don't know for sure how much each thing draws, then pick up a kill-a-watt and use that to determine load.
You can purchase this: [link]
which will take the guesswork out of whether an electrical-outlet USB charger is charging the device at its max output.
Like others said, it can't be much more than $30 or so a month. It's summer, so ACs are on, and you do have other devices in the house.
Get a Killawatt
For a more accurate measure
My 60" LED TV uses slightly less power than a 60W light bulb. Assuming yours is smaller, shouldn't be a big deal.
If you can't find how much juice it pulls in the original literature, you can buy (or rent) a kill-a-watt device that will tell you.
If you leave your PC on all day, it does use a lot of energy as a portion of your total bill, which may be why he's upset.
However, it's nothing as a portion of the maximum power a 15-amp circuit can deliver (1800 watts). Figure 300W for your CPU+GPU in game, 75W for the monitor, maybe 75W for extra stuff you have running on your desk; that's 450W, or 25% of the circuit's capacity. Some truly monster PCs can use more, if you have multiple GPUs for example. But honestly, it's not your PC.
To test the amount of power your devices are drawing, you could use a Kill-A-Watt. To test the whole circuit's draw you would need an amp clamp inside the panel, which your family won't let you touch. Are there window air conditioners running in your house? Might they be plugged into the same circuit?
But since you say the house is new, like I said in the other thread, I think this has something to do with your breakers (they should be arc fault interrupters?) and a specific device of yours that's upsetting them.
You're probably not using your power supply to its max output, but it's a good number to shoot for just to make sure. That means a UPS with a minimum output of 600 watts (assuming you only plug in your computer). If you just want the bare minimum, you can get a wattage indicator such as this one and find the exact max wattage your system uses under max load, but it's always better to go over than to come up short like your current one. And don't forget to account for any other devices, like monitors, that may be plugged in.
Depends on the CPU, I would expect something like Ryzen to be worth it. You should get a watt meter so you can keep track of your electricity usage.
Fun Fact: Switching to LED bulb saves about 80 watts of electricity per light bulb!
Get a watt meter and measure it. Online tools cannot know your actual power draw because of optimisations etc. you may or may not have done.
Once you have your watts, you can calculate it easily
X watts * 24 hours * 30 days / 1000 = Y kWh per month
Y kWh * your electricity rate = your rig's electricity cost per month
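That formula as a tiny sketch (the 300 W draw and $0.12/kWh rate below are example values, not anyone's actual numbers):

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=30):
    """Cost of running a constant load for a month."""
    kwh = watts * hours_per_day * days / 1000   # X watts -> Y kWh
    return kwh * rate_per_kwh

# Example: a 300 W rig running 24/7 at $0.12/kWh
# 300 * 24 * 30 / 1000 = 216 kWh -> $25.92/month
print(f"${monthly_cost(300, 0.12):.2f}")  # $25.92
```

Swap in the wattage your meter reports and the rate from your own bill.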
Get a hold of an energy meter - something like this: [link]. This is for North America - so you need to find something equivalent for Europe.
Otherwise - just read the label on the microwave. The power output should be there. It should be close to the number you are looking for if you use the microwave on high power.
I've an EVGA FTW 1080 and a 4670K @ 4.4GHz, and I could get by with a good 450W (I have a 750W, FWIW). I have a UPS which tells me the load. Even though it might run the PSU on the warm side, a 450W will be fine with your system and a 1070. The only exception is if you have a bunch (8+) of mechanical hard drives, and even then you may be fine.
If you want to know for sure, buy yourself a Kill A Watt; they're like 20 bucks.
If it could do that, most people would return it!
Use a kill-a-watt to measure your Watts
My dryer consumes 700 watts while it's in operation...
You can use a watt meter to measure its real power. Useful for everything. Sure, it will work with a cheaper UFO light. Why not, if it is only 1' x 1' (30cm x 30cm)? I am not entitled to tell you anything like I would for a full-size grow, where I would say "Please invest in better lighting."
It is a spacebucket and you can do whatever you want, from one single cheap 14W China LED to something smaller from KINDled, Advanced Platinum, or Mars Hydro. You want better yield? Go for more power and a quality product in the end. You just want to experiment? Trial and error works with this, but the risk is there for a reason too. You just want to use it for keeping plants in veg, or for cloning? It works fine too.
For 110V you could try one of these, but this one is only a 15A version. There might be a 20A version out there somewhere. Maybe even a 220V version.
Leaving a phone charger plugged in will not damage the charger, but most AC to DC converters will continue to draw power if plugged in, even when not charging a device.
Using one of these and multiplying by your rate for electricity will tell you how much leaving it plugged in all of the time costs you:
Yes, the computer uses significantly more electricity when mining as compared to when it is idle. For example my computer uses 80 watts on idle, but uses 250 watts when mining (optimised), 320 watts if I don't optimise the mining.
You should get a watt meter and check your electricity usage yourself [link]
Hello fellow ambulance owner :-)
Do you know where the ambulance distribution panel is that distributes power to all of your lights and accessories? You will probably see the black and red power feeds coming from the vehicle main battery and alternator. That would be where you would connect your deep cycle batteries and feed from the solar controller. You will need to disconnect the vehicle main power and cap those connections off.
As for the power consumption of your laptop and gear, look at the electrical ratings for your power supply, for example, my laptop charger says it can provide 1.5A max. 1.5A x 120V= 180W max. To look at your specific gear, buy a power meter like [link] and measure the max draw under worst case conditions.
There are a bunch of switches setup in different miners to see this. If you want to see exactly how much you're pulling from the wall socket, get you a wattage meter.
I use this one:
Gives you a bunch more data that the switches in the miners don't give, like kWh, amps, etc. The amps reading is good information: if you run multiple rigs, you can calculate whether or not you are pushing the amp limit of your circuit breaker.
Works great with no issues and it was cheap.
I was planning on picking up this electricity usage monitor to see how much power my computer was drawing, when it occurred to me that I have also been wanting to buy an uninterruptible power supply.
Is there a UPS on the market right now that also monitors power draw?
If you want to measure electricity usage you can use a watt-hour meter. Not recommending this one but as an example [link] There are other types that can be clamped around the cord. Also there are a lot of online resources which will give you estimates of average usage for devices (and some that explain how to calculate if you are inclined to do the math yourself).
> also, can I use something like 12 strip power surge protector? on the battery generator? because most of them I see have 2 outlets + 1 of the weird looking one's outlet and for my needs i need more than 2 outlet
You can as long as you don't exceed the capacity and trip the breaker / fuse. Of course the more things it powers though the sooner the battery will run down (if you don't have it connected to solar or something else to keep it charged).
Simple math: 1 kWh delivered costs about $0.17.
1270 kWh times $0.17/kWh = $215.90
What's hard to understand? Or do you not understand where you are using so much energy? You can find that out by buying a watt meter from Amazon or elsewhere.
With the meter you can test almost all of your electrical devices using 120v [they make 220v ones too].
I have one and use it routinely.
You can buy one of these ^^^ meters to test stuff.
Really, they'll say: change to LED bulbs, get new appliances, turn off stuff you are not immediately using, etc.
They won't check their precious meter to see if it's off, though.
The draw of your GPU will be by far the highest of anything in the system. Still, if the machine is dedicated to mining, you may want to disable hardware you don't need in the BIOS. You should not assume the GPU will draw the max of the PSU; 1000W is enough to power several graphics cards. The best way to make sure you get a precise estimate is to buy a power meter like this.
Sorry, I realize I made a mistake in suggesting that the link in my previous post also includes power consumption estimates. It does not.
I have a Kill-a-watt power meter which is a handy tool for checking power consumption. Some “smart” plugs and battery backup systems also have watt meters.
If you want to calculate the cost (e.g. to know how much money you are spending leaving something on) you will also need to know your own electricity price. National average in the USA is around $0.14 / kWh I think, but it varies WIDELY so check your own bill.
The 80% number is closer to reality for continuous running load. All of those "max" ratings you see on your electronics are not what your typical continuous load is. If you want to get a better idea of what things are actually drawing, pick up a Kill-A-Watt for about $20 and do some snooping on your devices. You'll likely find that you're running well below the rated numbers. It's a worthwhile $20 investment. You can use it all around the house to find out what things are drawing in this "there is no off" age of electronics.
I've actually been googling since mentioning them, and realized I should practice what I preach. It looks like a ready-made utility isn't available, but there are meters out there that a few forums suggest.
Something like this can give you an idea of how much you're drawing at any time and let you plan.
Or, if you don't wanna deal with another piece of tech, just get something way more than you realistically need.
Ah that makes sense. If you really want to check your power draw you can use something like a Kill-A-Watt to measure in real time what your computer uses. You can average it out and get an accurate representation of how much power you are using per kilowatt hour. Multiply times the cost of your energy and you see exactly how much you pay for your power on your PC.
Pretty handy device
This is your friend.
I meant to add that you can purchase a real time wattage meter such as the Kill-A-Watt: [link]
Just run a GPU and a CPU stress test to get the load on the cpu & gpu up to 100%/100%. Make a note of the total system power draw. Then run a test to stress only the GPU and make a note of the total system draw. Repeat this for the cpu. Average the three numbers to get a sense of your real world max draw. As long as that number is under 400 watts, you should be good.
Keep in mind that with over clocking, the TDP of the CPU or GPU doesn't scale linearly. This is due to the fact that you have to account for increases in frequency as well as cpu voltage. There's a formula for it: OC Wattage = TDP * ( OC MHz / Stock MHz) * ( OC Vcore / Stock Vcore )^2
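That formula, sketched in code (the TDP, clocks, and voltages below are hypothetical example values):

```python
def oc_wattage(tdp_w, stock_mhz, oc_mhz, stock_vcore, oc_vcore):
    """Rough OC power estimate: linear in frequency, quadratic in voltage."""
    return tdp_w * (oc_mhz / stock_mhz) * (oc_vcore / stock_vcore) ** 2

# Example: an 84 W TDP CPU pushed from 3500 MHz @ 1.10 V to 4500 MHz @ 1.25 V
estimate = oc_wattage(84, 3500, 4500, 1.10, 1.25)
print(f"{estimate:.0f} W")  # 139 W -- the voltage bump hurts more
                            # than the clock bump
```

Note how the squared voltage term dominates: a ~29% clock increase alone would give ~108 W, but the extra 0.15 V pushes the estimate to ~139 W.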
Watts = Volts x Current (Amps)
900W = 110V x Amps
900W / 110V = ~8A
8A x 2 Miners = ~16A Total
16A < 20A
Should be fine, as long as there's nothing else on this circuit with the miners. If you are in a newer structure, the wires in the walls and the outlet should be rated for 20A, especially if it was wired to a 20A breaker. You can get one of these Kill-A-Watt electricity usage monitors if you want to keep an eye on it. It would just plug into the outlet before the surge protector.
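The amperage math above, sketched with the same numbers (two ~900 W miners, 110 V, a 20 A breaker), plus the common 80% continuous-load rule of thumb:

```python
def amps(watts, volts=110):
    return watts / volts

total_amps = 2 * amps(900)             # two ~900 W miners: ~16.4 A
breaker_amps = 20
continuous_limit = breaker_amps * 0.8  # 80% guideline for 24/7 loads: 16 A

print(round(total_amps, 1))            # 16.4
print(total_amps < breaker_amps)       # True: under the breaker rating
print(total_amps < continuous_limit)   # False: slightly over the 80% guideline
```

So it works on paper, but for a 24/7 load it is close enough to the 80% guideline that keeping a watt meter on the outlet, as suggested above, is a good idea.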
Dude, buy the damn Kill-A-Watt. If you're mining and already put a few thousand bucks into a rig, just pay the 20 bucks for the peace of mind. No need to be a cheapo about it.
Unless you aren't paying for power, a watt meter is a good idea. Once I hooked my 1080s to one, I noticed that by undervolting my cards I saved so much in electricity that I actually earn more in net profit than by pushing my overclocks.
I am at 60% power limit, +240 core, -300 mem on my cards. They sit at about 60 degrees, and you should see only about 150-160W at the wall on one card, depending on your rig.
If you have the machine in house right now, you could buy this: [link]
Then you can, while running what you would normally run on it, get exact info.
I got the Ryobi version of this many years ago, unfortunately, it failed a few years after I bought it, but I was able to determine all of the power requirements of all my switches, routers and even my servers at the time.
You could grab a Kill-A-Watt for $20 and start plugging stuff into it to see what the wattage draw is. I'm guessing one of the computers is pulling a ton or the baseboard heating is actually hooked up to your meter.
I have an actual meter on it and I've gotten my power bill. Believe me, it consumes 1200-1300 watts 24/7 with all graphics cards running OCed. The heavy-duty extension cord I'm using is warm.
But you're saying the same thing someone else said: I might need a subpanel. Thank you for your advice. For now I might just ask them to do 3 20-amp breakers, but my end-of-year plan is to have 6 rigs running.
A typical residential circuit is 15 amps.
So you can draw 15 * 120 = 1800 watts from it. Actually a little less than that: you shouldn't load a circuit to more than 80%, so 1440 watts.
Even small space heaters draw 1000+ watts. That's the reason it's tripping: that leaves under 440 watts for the TV, cable box, and your computer.
You need a gadget like this to tell you how much things really draw.
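The circuit budget in that comment works out like this (same numbers: 15 A, 120 V, a 1000 W space heater):

```python
circuit_w = 15 * 120             # 1800 W absolute max on a 15 A / 120 V circuit
safe_w = circuit_w * 0.8         # 80% loading guideline -> 1440 W
heater_w = 1000                  # small space heater
remaining_w = safe_w - heater_w  # what's left for TV, cable box, and PC

print(circuit_w, remaining_w)    # 1800 440.0
```

Which is why a space heater plus a gaming PC on the same circuit trips the breaker.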
Just buy a Kill-A-Watt meter. It will answer all your questions and then a few you didn't even ask, 10/10 would recommend.
Power usage can depend on your usage of other appliances, the efficiency of the old PC, the usage of your new PC (as in, sleeping more than your old one) and many other factors. You can conduct your own testing using a Kill-A-Watt and get the hard answers.
It's not going to be running at its rating all the time, or even most of the time. The best and cheapest way to get a rough estimate you can work with is to get a Kill-a-Watt and see how much it draws at idle and how much it peaks, and you can extrapolate your average power and energy from there. If you could find something with a graphing feature, that would be ideal, but I don't know any offhand.
Yeah, parents suck sometimes. They think because they have authority they know what is right but it's not true.
I'm not sure what to tell you, best you could do is probably buy something like this and get 24 hours or more of data. If they still don't believe you then I'm not sure what else you can do except find a friend's house or something. Maybe you can find someone whose parents will let you run it for a small premium on the electricity?
Yes and no. First of all, the hopper comes painted inside. The paint isn't heat proof (good job Masterbuilt), so it curdles and chunks up on the inside. Burn the paint out with a torch and scrub it clean with a scouring pad. This helps A LOT. Keep it clean by scraping and scouring after that. Second, the element burns a little too hot and causes the fuel to burn up in the hopper instead of falling down. Get one of these.
You'll see the element pulls about 150 watts. If you hook it up inline with a dimmer switch, you can turn it down to about 75 watts. It will burn perfectly, make thin blue smoke, and not jam at that heat level. Just start it at full power and turn it down to 75 watts once it's going. I can get 7+ hours of nice smoke that way.
Edit: Don't use pellets in it!!! They get WAY too hot in there!
If you are convinced it is a power draw issue, you can do a couple things:
1) Replace PSU with something more efficient. I see yours is an 850w bronze unit. Power supplies operate at their highest efficiency (meaning the power they draw from the wall is closest to what the PC is actually using) around 50% load, so you want to figure out your system power under load, multiply that by 2ish, and target that range. Lower if you want better efficiency at idle. This will be a very small change in overall power usage though, so I probably shouldn't have put this as #1.
To give a numeric example (assuming a bronze vs a gold 800w power supply with a system under 400w load), a bronze PSU might pull 470 watts from the wall while a gold draws 444w, a difference of 26 watts. Not much, but not nothing either. Better power supplies are often more efficient below the 50% range as well.
2) Replace lights with LEDs. You should do this anyways because they last forever, don't get very hot and draw next to no power. Also get in the habit of turning lights off when not in use if you don't already.
3) Self-audit other sources of power usage in your home. Maybe get a power-meter like one of these. Find out what is using too much power at idle and get in the habit of turning it off.
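The efficiency example in point 1 works out like this (a sketch, assuming roughly 85% efficiency for an 80 Plus Bronze unit and 90% for Gold at 50% load; those percentages are approximations, not the commenter's figures):

```python
# Wall draw for a given DC system load and PSU efficiency.
def wall_draw(system_load_w, efficiency):
    """Watts pulled from the wall = watts delivered / efficiency."""
    return system_load_w / efficiency

load = 400                       # watts the components actually use
bronze = wall_draw(load, 0.85)   # roughly 470 W from the wall
gold = wall_draw(load, 0.90)     # roughly 444 W from the wall

print(f"Bronze: {bronze:.0f} W, Gold: {gold:.0f} W, "
      f"saved: {bronze - gold:.0f} W")
```

As the comment says, the difference is real but small, a couple dozen watts at most under load.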
Try one of these to see how much electricity it's using.
Pick this up and you're all set. Mine's been running plugged in for over 8 months now with no problems.
I hang dry everything except sheets and towels. When I use the dryer, it's all on the lowest heat setting possible to not damage the fabrics (the higher the heat, the more lint in the filter).
I live in Texas and we have pretty crappy air quality, so there's no chance I'll hang anything outside. The only time the humidity is a bit of an issue is around January-March, which is after the furnace stops coming on and before the AC kicks in.
We run a dehumidifier that time of year anyway since humidity is crazy high here all the time, so what I usually do is move the dehumidifier to the room where the laundry is.
I haven't noticed a change in power usage between running everything in the dryer (which I did when I first moved in) and hang drying. Basically, the dehumidifier or the AC will offset any of that use anyway.
You COULD pick up a little Kill-a-watt and measure how much power your dryer actually uses.
Hmm. Unless your dryer is a four-prong plug.
First off, don't take it in somewhere, this is a problem you should be able to solve on your own. Also, check to see if the light was blinking in a pattern to indicate which error it was, that would help greatly.
My guess is that running a fridge off an inverter 24/7 is too much for the battery capacity in less than optimal weather. Also, make sure you aren't parked in an area that receives shade.
Either way, find these specs: model of inverter, power drain of fridge, amp hour of battery, watts of solar
Report back, and if anything is glaringly wrong with the specs I'll let you know.
Tell the landlord you will spring for an outlet-level electric meter (here's an example) and that you will pay exactly the amount of electricity as shown on the meter, as calculated by LADWP current rates. Hell, offer to throw in an extra $5 monthly "I annoy the landlord" fee. It'll be way less than $100/mo.
I saw someone else recently ran their window AC unit consumption rates, and it came out to be like 50 cents a day.
I have a few of these in my house. I got one back in the day when I mined Ether for a few months (on a regular gaming computer, no crazy rig), and I liked knowing the exact wattage so much that I got a few more. Surprisingly cheap and effective.
Oh it’s not hard, just get one of these and plug it into your inverter and the AC into it: P3 P4400 Kill A Watt Electricity Usage Monitor [link]
Smart switches often measure power usage too, like the eve energy.
At any rate this will tell you the actual voltage your inverter is putting out and the kilowatt hours the AC consumes.
Watts are volts times amps. So if it says your AC used 600 watt hours, then your 12V battery spent 50 amp hours to make it (600/12 = 50).
Watt hours is an objective way to discuss power consumption.
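The conversion works out like this (a quick sketch, assuming a nominal 12 V battery and the 600 watt-hour example figure):

```python
# Convert the meter's energy reading into battery amp-hours.
energy_wh = 600        # watt-hours the AC consumed, per the meter
battery_volts = 12     # nominal battery voltage

# amp-hours = watt-hours / volts
amp_hours = energy_wh / battery_volts

print(f"{amp_hours:.0f} Ah drawn from the battery")
```

Keep in mind real batteries sag below nominal voltage under load and inverters lose a few percent, so treat this as a lower bound on the amp-hours actually consumed.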
This is the one I use. You want to measure the kilowatt hours (kWh). I found it consumed a bit under 0.5 kWh to charge a bird from zero, and the electric company in my area charges about 10 cents per kWh, so that is less than 5 cents per charge.
Using electricity during specific time periods is only cheaper if your agreement with your electric provider specifies this. Here in Houston TX we can choose our retail provider, some have contracts where electricity usage during certain times is discounted or free but they charge a higher rate overall so it's usually not worth it.
Typical tips for reducing electricity usage would include turning off any lights you don't need, turning the thermostat up, using a programmable thermostat to only cool the house when people are home, checking your doors and windows to make sure they're not leaking, stuff like that. You could also use something like a Kill A Watt to measure how much electricity various devices are using.
I feel like everyone is getting hung up on how many watts you're using in your PC.
Take your actual or max power usage (max being just over 850w based on inefficiencies/loss/etc for your PSU).
850w = 0.85 kilowatts. The power company bills you for energy in kilowatt-hours, not raw watts.
Next look at your power bill: it will show your charge rate per kilowatt-hour (kWh), i.e. the price of running a 1-kilowatt load for one hour.
So if your charge rate is $0.50 (50 cents) per kWh and your PC is drawing 850 watts (0.85 kW), it would cost you $0.425 (42 and a half cents) to run your PC for 1 hour.
Now clearly, your actual usage/costs are going to be different, but this is how you would calculate it.
If you are serious about finding your actual usage, buy a kill-a-watt - [link] and hook your rig to it and then use your billed rate from the power co to figure out how much it is costing you.
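The arithmetic above can be sketched as a one-liner (using the comment's own example numbers of 850 W and $0.50/kWh; your measured draw and billed rate will differ):

```python
# Running cost = power (kW) x hours x rate ($/kWh).
def running_cost(watts, hours, rate_per_kwh):
    return (watts / 1000) * hours * rate_per_kwh

# Example from the comment: 850 W for 1 hour at $0.50/kWh.
cost = running_cost(850, 1, 0.50)
print(f"${cost:.3f} per hour")
```

Swap in the wall-draw number from a Kill-a-Watt instead of the PSU rating and you'll get a far more realistic figure, since the PC rarely pulls its maximum.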
Good advice from u/dogeatdawg but I figured a link to this would help too:
Kill A Watt Electricity Usage Monitor
All four are your basic desktop CPU machines, and my house is all electric.
That reminds me, I need to dig out my kill-a-watt and see what the wall draw is for each of them...
Yes. I used a kill-a-watt meter on the hub with just the headset. Probably not the most accurate readings but gives me a pretty good idea of the amp draw. Mine definitely pulls more than my motherboard's ports can handle.
One thing you can do is get an Electricity Usage Monitor and check the power usage of your appliances by the kilowatt-hour to make sure nothing is a power hog. The device also calculates electrical expenses by the day, week, month, or year.
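That day/week/month/year projection is just multiplication on a measured daily figure. A hypothetical sketch (the 4 kWh/day fridge and $0.12/kWh rate are made-up example numbers, not from any comment above):

```python
# Project a measured daily energy figure into periodic cost estimates,
# the same way the monitor's built-in expense calculator does.
def project_costs(kwh_per_day, rate_per_kwh):
    daily = kwh_per_day * rate_per_kwh
    return {
        "day": daily,
        "week": daily * 7,
        "month": daily * 30,   # approximating a month as 30 days
        "year": daily * 365,
    }

# e.g. an older fridge measured at 4 kWh/day with a $0.12/kWh rate
costs = project_costs(4, 0.12)
print({period: round(dollars, 2) for period, dollars in costs.items()})
```

Running a few appliances through this makes it obvious which ones are worth replacing: a device costing pennies per day can still add up to real money over a year.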