and it came back
Mar. 13th, 2021 04:49 pm

So after months of good behaviour, the server closet started tripping the combo AFCI/GFCI breaker again. We fucked around with some shit, and it kept doing it... until we pulled the Comcast equipment out as part of an upgrade, and it stopped.
For weeks. They took out the battery-backed-up phone modem (with internal UPS) and the old cable modem, and replaced them with a single combined modem with no battery backup. It stayed stable for weeks... until today.
I mention this battery for reasons. Bear with me.
Similarly, I note that the Comcast tech doing the upgrade was very surprised when I said the two units drew a combined 25 watts. That's much more than they should've been drawing, but they drew that on the regular.
And also, the phone never stayed up when we lost power, despite the supposed built-in phone modem UPS.
This, too, I mention for reasons. Continue to bear with me.
Now during this functioning time, over these last weeks, I'd set up a test. This test consisted of installing an AFCI outlet in the server room. The intent was to see what would happen the next time the circuit tripped: would the AFCI trip too? Or would the breaker trip while leaving the AFCI untripped?
As of a few hours ago, I know that it leaves the AFCI untripped. Which means it's tripping on GFCI, and not on AFCI.
When the circuit finally tripped today - having gone from "every few hours" to "weeks" after Comcast pulled their modem with the fucked-up built-in phone UPS - several of our regular UPSes immediately showed one battery charge bar below full.
In theory that wouldn't happen so quickly - nothing we have drains those UPSes that fast. But it nonetheless happens sometimes. This is because while lead-acid batteries hold charge pretty well, they don't hold it perfectly or indefinitely, and they do self-discharge while on standby.
To deal with this, UPSes will top the batteries off every so often. Or, depending on the model, they'll trickle-charge them constantly to keep them most of the way up, boosting the charge rate as needed.
I think when enough of them kick into this mode in concert, that's what triggers the false GFCI hits.
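For a sense of scale, here's a rough back-of-the-envelope sketch of why a few chargers boosting at once could plausibly matter. Household GFCIs trip at roughly 4-6 mA of hot/neutral imbalance, and every plugged-in power supply leaks a little current to ground through its line filter; the per-device numbers below are assumptions I'm plugging in to show the shape of the problem, not anything I measured.

```python
# Back-of-envelope sketch of the "concerted charging" idea.
# The trip threshold is the standard residential figure; the per-device
# leakage numbers are illustrative assumptions, not measurements.

GFCI_TRIP_MA = 5.0       # nominal residential GFCI trip threshold (spec range is 4-6 mA)
STANDING_LEAK_MA = 0.4   # assumed steady ground leakage per plugged-in UPS/modem
BOOST_EXTRA_MA = 0.8     # assumed extra leakage while a charger is boosting

def total_leakage_ma(devices_idle: int, devices_boosting: int) -> float:
    """Sum the assumed ground leakage across everything on the circuit."""
    return ((devices_idle + devices_boosting) * STANDING_LEAK_MA
            + devices_boosting * BOOST_EXTRA_MA)

scenarios = [
    (5, 0, "five UPSes idling"),
    (2, 4, "three UPSes topping off at once + dead-battery modem stuck boosting"),
]
for idle, boosting, label in scenarios:
    leak = total_leakage_ma(idle, boosting)
    verdict = "would trip" if leak >= GFCI_TRIP_MA else "fine"
    print(f"{label}: {leak:.1f} mA -> {verdict}")   # 2.0 mA fine vs. 5.6 mA trip
```

The exact figures aren't the point; the point is that the threshold is small enough that a handful of simultaneous charge boosts - or one charger stuck boosting around the clock - could plausibly close the gap.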
And when one UPS is fucking up (e.g., the phone modem's completely-failed UPS), that makes that coincidental charging overlap a lot more likely. And it's possible - given that its battery dropped out instantly whenever we lost power - that the Comcast modem was in this charging state at all times.
Hence, hours between trips back then, once its battery had finally failed completely.
Various parts of this theory are not unique to me. While manufacturers don't admit this GFCI tripping problem outright, sometimes they kinda do (see previous posts) and it's generally kind of known to be an issue.
But I now know a couple of things:
- It's tripping on GFCI, not AFCI. Or at least, it did in this case.
- The server-room outlet does not require GFCI to meet code, only AFCI. The garage outlet on the same circuit requires GFCI, but not AFCI.
- I'd already installed an AFCI fixture in the server room, where it is needed.
- I already had a GFCI fixture ready to install in the garage, where that's needed.
- I still had the old 20A plain breaker.
So I've installed the GFCI outlet where GFCI is required, kept the AFCI outlet where AFCI is required, and put the plain breaker back in the panel it came out of to begin with.
Let the UPSes generate their bullshit false GFCI hits all they want to. Nobody's listening anymore. And the garage outlet can provide ground-fault protection down there - where it's actually needed - without bringing down the whole circuit.
And let my servers have their power. For the love of fuck, team. Let them have their power.
I don't know if this is fixed. I sure fucking hope it is, but I don't know.
But I had a hypothesis based on what I knew, and when I finally had an incident... it validated that hypothesis.
Let's see how things act now.