There's intent, deception, and damages, so it's definitely fraud. This isn't a mundane matter of creatively using someone's API in a way they don't like. He came up with a scheme to extract money from them. The ToS is the contract governing payments in this case (IIUC).
It's the difference between violating a no skateboarding sign in front of a shopping mall versus a no trespassing sign at a military base. They're both "just signs", right?
Oh I don't deny what he did is most likely a ToS violation. And under those terms, he should probably be forced to pay back the money.
But I don't see how it's fraud in the criminal sense. That's just my judgement as a citizen, not a lawyer. All I see is the shopping mall shaping criminal law to its own benefit.
As for the military bases, yeah, stay away from those, kids.
The point is that it doesn't matter what you call the contract. You're thinking "oh that's just a sign" (ie ToS). Your error is that not all signs are equal. The 8 million dollars is the military base in this analogy. Being prosecuted for violating this ToS under these conditions is not interchangeable with others.
Fraud is just any time you intentionally deceive someone for material gain. Even without the ToS this would presumably still qualify as fraud. The ToS just makes it more straightforward to argue (IIUC, IANAL, etc).
A decent rule of thumb is that if your hack or neat trick results in money in your bank account that the other party wouldn't have paid out to you had they been aware of what was happening then you are almost certainly committing a felony of some sort.
I guess forging documents and selling you a house which I don't own shouldn't be an actual crime either? The patterns of behavior in both cases are functionally indistinguishable.
Ha, I remember doing this with my Apple //. I forget what I was doing, but realized if I could set a pixel and later get what color was drawn at that location I could use it as a big array. Didn't know about peek/poke yet. One of those core "computers are magic" memories.
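For anyone who never played with one of those machines, here's a rough Python sketch of the trick: treat the graphics screen as a big array by plotting a "color" to store a value and reading the pixel back to load it. The `Screen` class below is a stand-in for Applesoft BASIC's lo-res PLOT and SCRN(x, y), not a real API.

```python
# Simulate using a graphics screen as storage: write a value by plotting
# a "color" at (x, y), read it back later by probing the pixel.

class Screen:
    def __init__(self, width=40, height=48):
        # Lo-res Apple II screen: 40x48 pixels, 16 colors.
        self.pixels = [[0] * width for _ in range(height)]

    def plot(self, x, y, color):
        self.pixels[y][x] = color  # like COLOR= then PLOT

    def scrn(self, x, y):
        return self.pixels[y][x]   # like SCRN(x, y)

# Treat the screen as a 40x48 array of small values.
screen = Screen()
screen.plot(10, 20, 7)       # "store" the value 7 at cell (10, 20)
value = screen.scrn(10, 20)  # "load" it back
print(value)                 # 7
```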
Fuck that. I just left a job where the IT dept just said "yes and" to the executives for 30 years. It was the most fucked environment I've ever seen, and that's saying a lot coming from the MSP space. Professionals get hired to do these things so they can say "No, that's a terrible idea" when people with no knowledge of the domain make requests. Your attitude is super toxic.
I suppose I understand why devs who don’t know how to say no, or work with stakeholders, are terrified of AI. What value do you have, at this point, when you’re unwilling to or incapable of pushing back on bad ideas?
You'd have to define a "bad idea" much more precisely and in the context of that particular business.
Developers often do push back and warn against ideas that have too many compromises, but cannot outright say no just because of that. There are too many other people involved.
You seem to think that any one person/group has/wants/should have full control when deemed necessary. That doesn't make sense unless either the success criteria are lacking (you call the shots alone and probably miss a ton of opportunities), or the requirements are so constrained that all the work is just optimizing the implementation (someone else already called the shots without you).
If your work is either of those situations it means the business plan sucks. AI is the least of your worries.
* I want to be clear, I'm using "you" in the general sense - apologies if it reads as accusatory.
If you lack the ability to say no to objectively bad ideas, you have very little value as a developer. Anyone can code a feature just because someone said to code it (Claude builds trivial objects for me every day when I know what I want but can't remember the specific syntax or pattern to do it). It takes actual skill and expertise to both recognize bad ideas, and convince people they're bad ideas.
> You seem to think that any one person/group has/wants/should have full control when deemed necessary.
No, I think subject matter experts should function as experts and should have decision-making power within their scope of expertise - if they're unable to convince others, then they are ineffective and should be replaced by SMEs who are effective.
I don't understand why you think a committee approach and implementation of bad ideas, regardless of what the experts in the room think, is an optimal business pattern.
Are you in an area with a bad electrical grid or something? In 40+ years I've never had a single device get fried from a surge/storm. My "surge protector" power strips are from the 90s and probably don't even work.
This. Same timeframe, and I've lived both through lots of lightning storms and in areas with lots of power failures. Some of them intermittent and essentially caused by transformers blowing up. Like earlier this winter: we had multiple storms where you'd hear a transformer blow up, in many cases even seeing the sky light up from it; power going out, a couple seconds, power coming back, the next transformer blowing out, rinse, repeat.
On the other hand I've read about plenty of stories of the "cheap" UPSs you'd usually buy as a consumer (not to name any brands coz I've never had any) actually causing such issues in the first place. Without any actual surges from the grid.
That said, being totally not superstitious (for real, but someone's gonna "kill me" if they find out I wrote this and something dies from a surge...), now I guess I need to knock on wood like seventeen times ...
I do use surge protectors when we're on generator power temporarily.
The things people often call "transformers blowing up" are usually not transformers blowing up.
Instead, it's usually just overhead wires that are too close or literally touching, often from influences like wind and ice. The electricity arcs between the wires, creating bright blue-white flashes that can be seen from far away, sometimes with instantaneous heat that makes hunks of metal wire evaporate explosively. It can be violent and loud, and repetitious as different parts of even a single run fail.
Transformers can certainly blow up, but that's less common. They're (generally) filled with oil for cooling purposes, and they're massive things that tend to take time to get hot. A failed transformer can produce arcing and blue-white light, but if things are that hot then the oil is also ready to burn.
And when the oil burns it isn't blue-white -- it burns with about the same yellow-orange color we saw the last time we accidentally flambéed dinner on the kitchen stove, or a Hollywood fireball.
A bright flash without a fire is probably not a transformer.
Haha, I hear you. But yes, it really is transformers blowing up sometimes. Sometimes it really is just branches blowing up the line, sure.
A branch hitting a wire happens all the time here too. Lots of trees in this community. The video of a transformer you shared: that's not the transformer I'm talking about. That's at a transformer station.
And yes, I know it's transformers and not just wires (but also wires do happen definitely) coz I do walk the neighborhood regularly and I can tell when a transformer is new vs. old up there. Ours is old. The ones a few streets over sometimes are very new and I see the Hydro trucks go by the next day(s) to make them new ;)
Again, like seventeen times knock on wood, but the ones next to us have not actually blown up. But three streets over, I've seen the new ones. Literally last weekend, we had an ice storm come through, and while there were no blowouts we could see or hear, the outage map showed plenty of failures.
Residential-scale transformers can and do explode. Shorts happen not infrequently, with freezing rain and ice storms especially causing issues: the internal oil gets displaced by water, and the dirty water causes an internal short. It wipes out power to a few blocks here when it happens, and we get an outage from it every year or two.
But when the wind is whipping along on a warm day and there are bright flashes and audible bangs, that's (usually!) not signs of transformers blowing up... even though the popular vernacular often erroneously describes it that way.
It happens. The power company was very unhappy with my boss for destroying one of their transformers. The thing is, while circuit breakers react very quickly to extreme overcurrent (shorts), they're much slower to react to loads that are only a bit over the limit, and if the overload is brief enough they won't react at all. Very common with heavy motors.
And that's exactly what the problem was--we had a whole bunch of really heavy motors. Getting ready to start for the day, you flip on the switches and the big machines start to spin. The transformer on the pole was rated higher than the main breaker for the plant--but the transformer apparently was more sensitive to the temporary loads. Once the problem was identified, it was resolved by staggering the startup: instead of flipping them all on at once, they were flipped on over 5 minutes.
It's not just cheap UPSes, it's cheap surge protectors as well. They exist because the vendor can throw in a MOV costing a few cents and increase the price of the power strip by 50%, not because they're any good. MOVs are sacrificial components which have either degraded to uselessness by the time they're actually needed or, if they're still working, can explode or catch fire from the energy dissipated. Even if they don't, all they're doing is converting an x-kV spike on active into an around-x-kV spike on neutral or ground. If you want to do it properly, use a series tracking filter, not a "surge protector".
One scenario: there's a short circuit somewhere, say rats chewing through insulation. This can cause a very high current through the short. A non-inverter 4500 watt 120 volt generator might have 0.2 ohms coil resistance, so the short circuit current can hit 170 volts / 0.2 ohm = 850 amps. When the shorted branch's circuit breaker trips, the inductance in the generating windings wants to keep that 850 amps flowing for at least a few microseconds, and it gets distributed across everything else that's still connected. Depending on what else is connected (hopefully including some surge protectors) the peak voltage can get into many kilovolts.
The circuit is something like this:
voltage source -- parasitic inductor --+- circuit breaker -- short
                                       |
                                       +- circuit breaker -- your PC
More generally, for the previous poster: look at what happens when a magnetic field collapses suddenly; you can get kilovolt spikes. There are probably a ton of YouTube videos demonstrating this in various ways; it sounds like the sort of thing that Electroboom would do. Normally this is handled via snubber circuits which dissipate the energy before it can do anything, but in exceptional cases it could end up going where it shouldn't.
Definitely use quality surge protectors on expensive equipment connected to generators.
PSA: UPSes and GFCI/GFI extension cords won't work properly when connected to a stand-alone generator with a bonded neutral. I've tried using enterprise UPSes on such generators, but they absolutely won't work. In such scenarios, debond the generator's ground from neutral, apply a very large warning label to it being debonded, and drive a massive ground rod electrode into the ground as close to the generator as possible and ground the neutral there. This does work and is much safer because there's a stable voltage reference source. It's more of a hassle but can be necessary for some off grid and temporary scenarios.
GFCI works correctly either way. Their operating mode doesn't care at all about ground: Whether bonded, not bonded, or not even present (look, ma! only two wires!), they still perform the same way.
They respond to an imbalance in current flow betwixt line and neutral. What goes out must return; if it doesn't, then switch off.
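The trip logic is simple enough to sketch. The ~5 mA threshold below matches the usual US Class A residential spec; treat the exact figure as illustrative.

```python
# A GFCI trips on the difference between outgoing (line) and returning
# (neutral) current -- ground is not part of the measurement at all.

TRIP_THRESHOLD_A = 0.005  # ~5 mA, typical US Class A GFCI

def gfci_trips(i_line_amps, i_neutral_amps, threshold=TRIP_THRESHOLD_A):
    """Trip when current is leaking somewhere other than the neutral return."""
    return abs(i_line_amps - i_neutral_amps) > threshold

print(gfci_trips(10.0, 10.0))   # balanced load: no trip
print(gfci_trips(10.0, 9.99))   # ~10 mA going astray: trip
```

Note there's no ground term anywhere in that comparison, which is why bonding arrangements don't change GFCI behavior.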
> In such scenarios, debond the generator's ground from neutral
eeeeep. Please for the love of all that is holy, CONTACT AN ELECTRICIAN before messing around with that - or before creating a ground bond where none should be (i.e. TT grid [1]). You may end up endangering yourself if you do not exactly know what you are doing - in the case of TT, you get ground potential difference current from other parts of the grid flowing to ground via your generator's bond. Best case you're getting problems with electrochemical corrosion (including in your foundation), worst case enough current flows to turn your bond wire into a thermal fuse.
Also, take great care if your grounding is provided via municipal water service, or if your original grounding rod has dried out to the point it's ineffective.
Let me repeat: LET ELECTRICIANS DEAL WITH GROUNDING AND SURGE PROTECTION. Floating grounds and improper ground connections CAN BE LETHAL OR POSE A SERIOUS FIRE RISK.
AND YES THAT INCLUDES "ISLAND" SCENARIOS OR EMERGENCY POWER INPUTS (e.g. via CEE plugs and transfer switches).
I'm not sure I'd leave something like this to an electrician. Or if so at least make that electrician be experienced in this field. I think you'd want an electrical engineer to be involved with the plan to some degree.
Electrical engineers don’t know code requirements and wiring guidelines for household electrical wiring. They’re absolutely not the correct default. Electricians with specialization in generator setups, sure, but an electrical engineer on average is likely going to be more uninformed on code requirements than an electrician.
Electrical engineers know the theory but lack the practical knowledge which grid form is used at your specific address (yes, here in Germany we have a few towns where one half side of a street runs TT and the other one is already migrated to TN-C or TN-C-S).
An electrician specializing in lightning protection, uninterruptible power installation or in radio installations can sort out all of that far better than an engineer can.
That's an extreme edge-case and a strawman. Anyone operating temporary equipment on a generator during a severe storm will obviously unplug sensitive stuff to not take unnecessary chances regardless of safety precautions already in place.
Ground rods are required in certain situations according to the NEC.
Ground rods are for lightning protection, transient surges (over voltage), and induced surges; not for short protection, ground faults, or making ordinary extension cord use of bonded generators "safer".
Typically, they're required whenever it's a system that powers a building on its own, i.e., off-grid setup or with a floating neutral generator connected via a switched neutral transfer switch.
You can unplug everything and open all the switches, but a nearby lightning strike will still fry your generator through that unbonded ground rod. Lightning ground potential is very eager to take the shortcut to your other ground rods through a few millimeters of insulation and open switches on the path through your generator and house wiring, when the alternative might be tens of meters of dirt :)
I don't care what the NEC doesn't say, NFPA 780 says you have to bond all ground rods.
When I lived in Costa Rica, I lost three surge protectors in a year to power surges. During one such power surge, I didn't notice that the red light indicating surge protection was already out, and a power surge fried my (knockoff) Macbook power adapter, leaving me without a way to work for a day.
Not only should you get rid of them; they're also a fire hazard.
Also, do not accidentally plug surge protectors into each other; metal oxide varistors can start fires _without_ meaningful surge conditions when you do so.
I prefer to buy products without MOVs entirely due to the risk, with the exception of one, Tripp Lite Isobars; but I prefer to use series mode protectors such as Brickwall or SurgeX.
> Not only should you get rid of them, but also they are a fire hazard.
Are they not a fire hazard even when new? MOVs do tend to degrade with use (especially after they've gone conductive to snuff one or more surges). But AFAICT we can't really know, without potentially-destructive testing, whether a given MOV is in good shape -- whether installed last week, last year, or 30 years ago.
> Also, do not accidentally plug surge protectors into each other; metal oxide varistors can start fires _without_ meaningful surge conditions when you do so.
What is the mechanism that increases risk for MOV-sourced fires in this arrangement?
I've also noticed that many of the power supplies I've taken apart (for very pedestrian consumer goods) have internal MOVs on their line input. Whatever the mechanism is that increases risk, isn't using one external surge protector already doing that in these instances?
> I prefer to buy products without MOVs entirely due to the risk, with the exception of one, Tripp Lite Isobars; but I prefer to use series mode protectors such as Brickwall or SurgeX.
I prefer to avoid MOVs, too. Broadly-speaking, diodes seem like a better way to do it. (Transtector is another reputable brand that uses diodes.)
---
That all said, I've noticed over the years that problems with dead (presumed-to-be-hit-by-a-power-surge) electronics tend to follow particular structures. And the reason for this seems related to grounding more than it is anything else.
So when I find someone (a friend, a client, maybe someone online that I'm trying to help) complaining about repeated damage, I often ask about grounding. Almost always, it turns out that they've got multiple grounding points for the electronics: the electric service has one ground rod, and the telephone/cable feed/satellite/whatever is connected to some other ground.
This might be a dedicated rod, maybe a metal pipe; whatever it is, it is distinct from the main service ground. It happens all the time. (It is worth noting that the NEC prohibits this kind of configuration unless extraordinary effort is put forth. See 800.100(d), for example.)
The way that MOVs -- and avalanche diodes alike -- behave combines with the fact that the earth is an imperfect conductor, such that having multiple ground points promotes dynamic ground loops that can provide quite large potential -through- the electronics that we seek to protect.
The problem appears suddenly, and repetitiously. Everything is fine, and then ZANG: The cable modem gets smoked along with the router it is connected to. So the modem goes back to Spectrum or wherever to get swapped, and the router gets replaced again, until the next time: ZANG.
TV connected to satellite receiver, with coax incorrectly grounded? ZANG. Over and over again.
I'd see it all the time when I was a kid back in the BBS days: The phone line was grounded improperly, and computer was the only thing that connected to both electricity and the telephone line. Some folks would go through several modems over the course of a summer, which was very expensive -- while most people had no problems at all. Next-door neighbors would have completely different failure rates.
Structures with correct grounding tend to do very well at avoiding these issues, and I've fixed these conditions in subsequent years more times than I can count.
(A coworker installed a phone system at a business once, wherein he made extensive use of Ditek surge suppressors -- on the incoming POTS lines, and on the power inputs. It blew up one day. So he called Ditek to try to get at least the cost of the phone system hardware covered. They asked him to draw up a map of how the building was grounded and send that over, so that's exactly what he did. When they saw his map, they very quickly identified a ground loop and denied the claim.)
"What is the mechanism that increases risk for MOV-sourced fires in this arrangement?"
I wondered the same thing, and failed to find a satisfying explanation.
I can find plenty of reports of MOV fires, especially in situations where there's a persistent over-voltage, e.g. a 120 V site actually having closer to 240 V due to a floating neutral. But I don't see how chained MOVs make that worse in general. This blog post has some nice photos:
No clue about the actual reliability of this[1] article but the mechanism mentioned (new pathways due to changes in crystalline structure due to uneven heating) sounds possible.
Heavy industry can also cause these kinds of power surges to happen.
Last year an aluminum smelter in Iceland had a transformer blow which caused a big power surge on parts of the very well developed national power grid. The surge caused damage to electronics in some households and companies near to the smelter.
EE living in a rural location here: transient related failures do happen in my experience. Rare but they happen. And I've known of people who had everything in their house fried. For me it's just been a couple of Ethernet ports. Power strips don't provide much protection fwiw. Always worthwhile checking that your electrical service is properly grounded.
I lost an audio mixer to a bad surge last year. I don't know whether it was additional load or just really bad fluctuations that damaged the device. Nothing else bit the dust, but the digital board in this mixer got bricked.
Just saw this days later - I'm not sure if it was a surge specifically, but it got bricked after a really wacky power swing. So maybe not a surge, exactly.
I live in Oakland, and the East Bay has had its share of random outages without incident. The last outage was just a split-second blip, but it broke a $1k computer monitor.
Another perspective: we should install whole house surge protectors if we can afford them, not only for ourselves, but to help our neighbors - even if in reality the help is minimal and they need their own as well. In the best case scenario, if everybody in a neighborhood has them, each individual house will be more resistant to surges than if they were the only house with one (five houses with surge protectors nearby is a lot better than one) - everybody wins.
Why? If the voltage spikes on the grid (that's what I understand a power surge to mean), wouldn't even more of it end up in your house (that is: the grid voltage spike even higher) if the neighbors have equipment that doesn't let their devices consume some of that energy?
Edit: wait, maybe I figured it out: those devices must be consuming the excess rather than blocking it. Is that it?
Yes, energy dissipates. Although one still needs to watch the distance to the neighbours, as your ground potential can differ from your neighbours'.
Yes, that puts it down perfectly. That’s why some don’t ever see the benefit of installing their surge protector whereas others install one way too small for their situation and find them useless anyway.
Not OP but I had an XML file with inconsistent formatting for album releases. I wanted to extract YouTube links from it, but the formatting was different from album to album. Nothing you could regex or filter manually. I shoved it all into a DB, looked up the album, then gave the xml to a local LLM and said "give me the song/YouTube pairs from this DB entry". Worked like a charm.
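A rough sketch of that pipeline, in case it helps: the schema, the sample XML, the prompt wording, and the `query_local_llm` call are all hypothetical placeholders I made up; only the SQLite part uses a real API.

```python
import sqlite3

# Stash the inconsistently formatted XML per album, then hand one
# album's blob to a local LLM to pull out the song/YouTube pairs.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE releases (album TEXT, raw_xml TEXT)")
db.execute(
    "INSERT INTO releases VALUES (?, ?)",
    ("Example Album",
     "<release><track>Song A https://youtu.be/xyz</track></release>"),
)

def build_prompt(album):
    # Look up the album's raw XML and wrap it in the extraction request.
    row = db.execute(
        "SELECT raw_xml FROM releases WHERE album = ?", (album,)
    ).fetchone()
    return "Give me the song/YouTube pairs from this DB entry:\n" + row[0]

prompt = build_prompt("Example Album")
# pairs = query_local_llm(prompt)  # hypothetical call to your local model
print(prompt)
```

The point of the DB step is just that a keyed lookup beats regexing a giant inconsistent file; the LLM handles the per-album formatting chaos.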
Missing the point. I no longer need to buy or rely on someone else for software I want to use. A lot of things I want to do ARE one offs. I can write software and throw it away when I'm done.
I know this sounds sarcastic but I really mean it: For years everyone has been monastically extolling some variation of "the best code is deleted code". Now, we have a machine that spits out infinite code that we can infinitely delete. It's a blessing that we can have shitty code generated that exposes at light speed how shitty our ideas are and have always been.
How can you possibly be this confident if you don't know the number of times Firefox was run and number of bug reports submitted? Say it's run 100,000,000 times, 1000 reports are submitted, and 10 are bit flips. Seems reasonable. You're misinterpreting what they are saying.
10% of 1000 isn't 10; it's 100. And no, it's not reasonable: the main reason is that you cannot reliably tell remotely whether something is a bit flip, because bit flips affect both code and data. Also, 10% of a semi-obscure specific category of failures seems to indicate that the population submitting crashes isn't random enough. I'm a layman in statistics, but this doesn't seem correct, at least not without concrete details on the kinds of bugs being reported and the methodology used. Claiming 10% and being able to demonstrate 10% are different things, and the tweet thread indicates that this is clickbait: something along the lines of "may potentially be a bit flip". Well, every error may be a bit flip.
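Running the numbers on that hypothetical (all inputs here are the made-up figures from the comments above, not real Firefox data):

```python
# Sanity-check the arithmetic: 10% of 1000 crash reports is 100, not 10.
runs = 100_000_000       # hypothetical Firefox launches
reports = 1_000          # hypothetical crash reports submitted
bitflip_fraction = 0.10  # the claimed "10% are bit flips"

bitflip_reports = reports * bitflip_fraction
rate_per_run = bitflip_reports / runs
print(bitflip_reports)   # 100.0 reports attributed to bit flips
print(rate_per_run)      # one bit-flip crash per million runs
```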
IP doesn't handle roaming very well. If you got routed onto the internet directly from your local cell tower, then your connections would drop whenever you switched to a different tower, which is somewhat suboptimal. Cell networks handle it at a lower level and route your traffic through a central location which serves as the origin of your IP traffic. Geolocate your IP while on cell data and you'll probably see something pretty far away from where you are. My phone's IP address at the moment is about 400 miles away from the actual phone.
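One way to eyeball that offset yourself: take the phone's GPS coordinates and the coordinates a geolocation service reports for its IP, then compute the great-circle distance. The two coordinate pairs below (roughly New York City and Washington, DC) are placeholders, not real geolocation output.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius ~3958.8 miles

phone_gps = (40.7128, -74.0060)  # placeholder: where the phone actually is
ip_geo = (38.9072, -77.0369)     # placeholder: where the IP geolocates to

print(f"{haversine_miles(*phone_gps, *ip_geo):.0f} miles apart")
```

A gap like that is normal on cellular: it's the distance to the carrier's packet gateway, not an error in the geolocation database.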