> A bit like how java people insisted on making naive getFoo() and setFoo() to pretend that was different from making foo public
But it's absolutely different and sometimes it really matters.
I primarily work with C# which has the "property" member type which is essentially a first-class language feature for having a get and set method for a field on a type. What's nice about C# properties is that you don't have to manually create the backing field and implement the logic to get/set it, but you still have the option to do it at a later time if you want.
When you compile C# code (I expect Java is essentially the same) that accesses a member of another class, the generated IL/bytecode differs depending on whether you're accessing a field, a property or a method.
This means that if you later find it useful to intercept reads or updates of a field and add some additional logic (e.g. you now want to do lazy initialization), naively changing the field to a method/property (even with the same name) will make existing code compiled against your original class fail at runtime with something like a "member not found" exception. Consumers of your library will be forced to recompile their code against your latest version for things to work again.
By having getters and setters, you have the option of changing things without breaking existing consumers of your code. For certain libraries or platforms, this is the practical difference between being stuck with certain (now undesirable) behaviour forever or trivially being able to change it.
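The same binary-compatibility point holds in Java. A minimal sketch (the class and method names here are hypothetical): because callers go through the getter, you can add lazy initialization later without breaking code compiled against the old version.

```java
// Hypothetical example: Config started life returning a plain stored value.
// Call sites compile to an invokevirtual of getTimeout(), so the body can
// change later (lazy init below) without consumers recompiling.
public class Config {
    private Integer timeout; // backing field, now lazily initialized

    public int getTimeout() {
        if (timeout == null) {       // logic added in a later version;
            timeout = loadDefault(); // old callers are unaffected
        }
        return timeout;
    }

    private static int loadDefault() {
        return 30; // stand-in for an expensive lookup
    }

    public static void main(String[] args) {
        System.out.println(new Config().getTimeout()); // prints 30
    }
}
```

Had `timeout` been a public field instead, call sites would have compiled to a direct field access (`getfield`), and replacing the field with a method would raise `NoSuchFieldError` at runtime until every consumer recompiled.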
Adding lots of code for the common case to support consumers of the code not recompiling for some uncommon potential future corner-cases seems like a bad deal.
In a product world where customers are building on your platform, requiring that they schedule time with their own developers to recompile everything in order to move to the latest version of your product is an opportunity to lose one or more of those paying customers.
These customers would also be quite rightfully annoyed when their devs report back that the extra work could have been entirely avoided if your own devs had followed the industry norm of using getters/setters.
Maybe you're not a product, but various other teams at your organization use your library. Now, in order to go live, you need to coordinate with those teams so they update their code and things don't break. They will report to their PMs how this could all have been avoided if only you had used getters and setters, like the entire industry recommends.
Unless you're in a company with a single development team building a small system whose code will never be touched by anyone else, it's a good idea to use getters/setters. And even then, what's true today might not be true years from now.
Maybe I don't understand as I'm an outsider, but as per my recent comment on this topic [0], I fail to see the logic of how "other countries" pay the US when the tariffs are paid by the importer and not the other country which is exporting.
I do acknowledge that import taxes can in theory help local industries, especially if the other countries are subsidizing exported goods.
Tariffs almost never make sense, unless it's an industry that's super important for your own survival.
Capitalism is about efficiency, and there are always going to be countries where producing certain items is more efficient. East Asian countries have spent decades innovating and investing in their manufacturing capabilities.
Also, one thing that grinds my gears is the narrative around trade balances that focuses only on physical goods while conveniently ignoring services.
The US exports trillions in software, AI, music, video games, financial services and cloud, and that's conveniently ignored.
Eventually tariffs come back to bite those who impose them, because the moment your local industries no longer need to compete to survive, they have no incentive to innovate.
It's not that it's conveniently ignored. It's that a services economy, while lucrative in good times, leaves you lacking self-sufficiency and resilience. We should never have let manufacturing leave to the extent that we did, all in the name of efficiency above all else.
Of course it goes without saying that launching an absurdist comedy interpretation of a global trade war is not the way to fix the problem.
I don’t have a source for this handy, but I believe that in terms of goods produced, manufacturing is actually still alive and well in the US. It’s just that this is done by a much smaller and more automated workforce.
The importer pays directly. There are three ways the importer can deal with the burden of that. In most cases it will be a combination of all three of them.
1. Raise the price they sell the imported item for.
2. Eat it.
3. Lower the price they are willing to pay the exporter.
For the Trump tariffs, it has overall been about 90-96% #1 and #2, and 4-10% #3. I haven't seen a breakdown of how #1 and #2 are split.
Next time you wonder why a Trump supporter has a bad argument, remind yourself there was a nonzero number of them who literally drank bleach and Lysol after he told them to.
I like how you're being downvoted for pointing out an objective, documented fact. [1]
Of course an important corollary is that Trump did not, in fact, tell anyone to drink bleach or Lysol. His supporters were stupid enough to do it on their own initiative, at Trump's mere suggestion that it was worth looking into.
> I fail to see the logic of how "other countries" pay the US when the tariffs are paid by the importer and not the other country which is exporting.
The "logic" is/was that this was a lie directed at his "low information supporters" who tend to simply "believe" whatever he tells them without question. Those same supporters would have been very much against having a "tax increase" levied upon them, but so long as he lied to them and told them "the other country pays the tariffs" then they were fooled into not understanding the tariffs were just a tax increase and so were "in support" of the tariffs.
That was the sole logic -- although there have been times when I've seen news blurbs that have made me wonder whether Trump himself actually believes his own lie about "other countries pay us" in regards to tariffs.
I always thought of outlook.com as a rebranding of Hotmail (which itself had been continually evolving and was probably branded “Live” at that point), so I would expect it is the same (ever evolving) infrastructure.
In which case, people like me with an @hotmail.com address from the 90’s were much earlier users of the outlook.com email boxes than when the domain was “launched” by Microsoft.
I feel the greatest trick of American politicians is that the term “tariff” tends to be used by most people instead of “import tax”.
I live in South Africa and we have significant import taxes on certain kinds of items, but nobody (aside perhaps from economists/accountants/tax practitioners, etc) calls them tariffs.
It’s not the overseas seller who pays extra for the item to be imported, it’s the importer, paid to the tax man. It’s a tax paid on imports. That cost is ultimately passed on to end buyers, such as myself. Why would I generally refer to it as a tariff except for reasons of pedantry?
You can’t change anything about a commit without changing its SHA hash (and the hash of every descendant commit), which causes pulls to break for anyone tracking the rewritten history.
GitHub hides the emails on their web UI, but nothing stops people from pulling the repository with a Git client and looking at the emails in the commit log after doing so.
Which is why you should be careful to never use your actual email in git commits.
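GitHub's hiding is purely a web UI affair; the addresses live in the commit objects themselves. A quick demonstration (throwaway repo under `/tmp`, made-up identity):

```shell
# Set up a throwaway repo and commit with an identity configured
# only for this demo (hypothetical name/email).
rm -rf /tmp/email-demo
git init -q /tmp/email-demo
git -C /tmp/email-demo -c user.name="Jane Doe" -c user.email="jane@example.com" \
    commit -q --allow-empty -m "demo"

# Anyone who clones can read the author email straight out of history:
git -C /tmp/email-demo log --format='%an <%ae>'
# → Jane Doe <jane@example.com>
```

Setting `user.email` to a noreply or throwaway address before committing is the only real protection, since the value is baked into the commit forever.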
When I made a patch to the Linux kernel I did have to use a real email, since you have to send it to their mailing list. I used a throwaway address, which I have since configured on my mail server to forward to /dev/null (yes, I'm one of the weirdos still self-hosting email in 2026). The amount of spam I got was insane, and not even developer-relevant spam.
Although their stated reason for hoarding is that they "really need it", I think it was a strategic move to make their competitors' lives more difficult with little regard for the collateral consequences to non-competitors, such as regular people or companies needing new computers.
Absolutely. The “ORM == bad” viewpoint strikes me as highly ignorant of all the benefits they provide, particularly in statically typed languages.
People like me don’t choose an ORM to avoid having to learn SQL (you’ll still need to know it); we choose one because 99% of the time it’s a no-brainer that vastly increases productivity and reduces bugs.
In a language like C#, EF Core can easily cover 95% (likely more) of your SQL needs with performance as good as raw SQL. For the small percentage of use cases where its performance is lacking, you fall back to raw SQL.
And if being saved from writing 95%+ of your SQL queries isn’t compelling enough, that’s just one benefit of EF Core. Another major time saver is not having to manually map SQL results to objects.
But an often underrated and incredibly valuable benefit, especially on substantial code bases, is type safety. Queries written using LINQ are checked at compile time, catching mistakes in column or table names for free.
Want to refactor your schema because your business domain has shifted, or you just understand it better than before? No problem. Use standard refactoring tools on your C# code base, have EF Core generate the migration, and you’re done in tens of minutes, including fixing all your “SQL queries” (which were in LINQ).
EF Core is almost always a no brainer for any team who wants high quality and velocity.
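As an illustrative sketch only (the `Order` entity and `ShopContext` names are invented, and this won’t run without a configured database provider), a compile-time-checked EF Core query looks like ordinary LINQ:

```csharp
using Microsoft.EntityFrameworkCore;

// The query is checked at compile time against the entity type: a typo in
// a property name is a build error, not a runtime SQL error, and renaming
// Total with a refactoring tool updates every query that uses it.
using var db = new ShopContext();
var bigOrders = db.Orders
    .Where(o => o.Total > 100m)       // translated to SQL by EF Core
    .OrderByDescending(o => o.Total)
    .ToList();                        // rows mapped back to Order objects

// Hypothetical entity and context; a real app would configure a database
// provider in OnConfiguring or via dependency injection.
public class Order
{
    public int Id { get; set; }
    public string CustomerName { get; set; } = "";
    public decimal Total { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders => Set<Order>();
}
```

The mapping back to `Order` objects and the compile-time checking are exactly the two benefits described above; the raw-SQL escape hatch (`FromSqlRaw` and friends) remains available for the cases EF Core handles poorly.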
I am occasionally called upon by the local consulate to perform my civic duty and vote.
Just this week I sent them back my ballot, now marked, for this referendum in a sealed envelope.
This referendum required me to dig more deeply than usual into Italian politics before I could decide which way I wanted to vote.