A utilitarian effective altruist absolutely should defraud anyone not already on their team.
They precommit, along with the others in on the take, that 90% of the profit will go to the Cause, so all the excess that would otherwise be wasted on luxury or low-QALY retirement/dying years instead goes to places that generate more utils.
I wouldn't be shocked if a document (or, more likely, testimony in a trial) eventually turns up explicitly saying that.
(To be clear, I think the plan I have described is reprehensible, but that's because I'm not a utilitarian.)
And what might we expect the second-order effects of "EAs are philosophically committed to fraud" to be? This only looks rational if you take a first approximation of a dynamical system and call it good.
That's weird: shouldn't he be doing everything possible to avoid going to jail? Nobody wants to go to jail.