I’ve seen forms of this joke quite a lot in the last few years, and it never fails to make me laugh.
I’m gonna hazard a guess, just ’cause I’m curious, that you’re coming from JavaScript.
Regardless, the answer’s basically the same across all similar languages where this question makes sense. That is, languages that are largely, if not completely, object-oriented, where memory is managed for you.
Bottom line, object allocation is VERY expensive. Generally, objects are allocated on a heap, so the allocation process itself, in its most basic form, involves walking some portion of a linked list to find an available heap block, updating a header or other info block to mark that the block is now in use, maybe sub-dividing the block to avoid wasting space, and making any updates that might be necessary to the nodes of the linked list we traversed. Something like the sketch below.
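Here’s a deliberately naive first-fit allocator, just to illustrate the moving parts. To be clear, this is NOT how any real runtime’s allocator works, and every type and name in it is made up for illustration:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical toy heap: a linked list of free (Offset, Size) blocks,
// searched first-fit. Real allocators are vastly more sophisticated.
public class ToyHeap
{
    private readonly LinkedList<(long Offset, long Size)> _freeList = new();

    public ToyHeap(long capacity) => _freeList.AddFirst((0L, capacity));

    public long Allocate(long size)
    {
        // Walk the free list for the first block that's big enough.
        for (var node = _freeList.First; node != null; node = node.Next)
        {
            var (offset, blockSize) = node.Value;
            if (blockSize < size)
                continue;

            if (blockSize == size)
                _freeList.Remove(node); // exact fit: consume the whole block
            else
                node.Value = (offset + size, blockSize - size); // sub-divide

            return offset; // the "address" of the newly-allocated object
        }

        throw new OutOfMemoryException("No free block large enough.");
    }
}
```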
THEN, we have to run similar operations later for de-allocation. And if we’re talking about a memory-managed language, that means periodically running a garbage-collection algorithm that somehow inspects allocated blocks to determine whether they’re still in use, or can be automatically de-allocated. The most common garbage-collection approach I know of is tracing: the runtime tracks the references objects hold to other objects, so that the GC can start at the “root” objects and walk the entire tree of references within references, in order to find any objects that are orphaned, and identify them as collectable. In toy form, that walk looks something like this:
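Again, this is a made-up sketch of the mark-and-sweep idea, not anything resembling actual .NET GC internals; all the types here are invented:

```csharp
using System.Collections.Generic;

// Hypothetical object graph node, for illustration only.
public class ToyGcObject
{
    public bool Marked;
    public List<ToyGcObject> References = new();
}

public static class ToyCollector
{
    // "Mark" phase: walk everything reachable from the roots.
    public static void Mark(IEnumerable<ToyGcObject> roots)
    {
        var pending = new Stack<ToyGcObject>(roots);
        while (pending.Count > 0)
        {
            var obj = pending.Pop();
            if (obj.Marked)
                continue; // already visited (object graphs can have cycles)
            obj.Marked = true;
            foreach (var child in obj.References)
                pending.Push(child);
        }
    }

    // "Sweep" phase: anything never reached is orphaned and collectable.
    public static IEnumerable<ToyGcObject> FindCollectable(IEnumerable<ToyGcObject> allObjects)
    {
        foreach (var obj in allObjects)
            if (!obj.Marked)
                yield return obj;
    }
}
```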
My bread and butter is C#, so let’s look at an actual example.
```csharp
public class MyMutableObject
{
    public required ulong Id { get; set; }
    public required string Name { get; set; }
}

public record MyImmutableObject
{
    public required ulong Id { get; init; }
    public required string Name { get; init; }
}
```
```csharp
_immutableInstance = new()
{
    Id = 1,
    Name = "First"
};

_mutableInstance = new()
{
    Id = 1,
    Name = "First"
};
```
```csharp
[Benchmark(Baseline = true)]
public MyMutableObject MutableEdit()
{
    _mutableInstance.Name = "Second";
    return _mutableInstance;
}

[Benchmark]
public MyImmutableObject ImmutableEdit()
    => _immutableInstance with
    {
        Name = "Second"
    };
```
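For anyone wanting to reproduce this: the snippets assume a BenchmarkDotNet class roughly like the sketch below. The field and method names come from my snippets; the class name and scaffolding are my reconstruction, and [MemoryDiagnoser] is what produces the Gen0/Allocated columns in the results.

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

[MemoryDiagnoser] // enables the Gen0 / Allocated columns in the results
public class MutabilityBenchmarks
{
    private MyMutableObject _mutableInstance = null!;
    private MyImmutableObject _immutableInstance = null!;

    [GlobalSetup]
    public void Setup()
    {
        // the initializers shown above run here
        _mutableInstance = new() { Id = 1, Name = "First" };
        _immutableInstance = new() { Id = 1, Name = "First" };
    }

    // ... the [Benchmark] methods shown above go here ...
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<MutabilityBenchmarks>();
}
```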
Method | Mean | Error | StdDev | Ratio | RatioSD | Gen0 | Allocated | Alloc Ratio |
---|---|---|---|---|---|---|---|---|
MutableEdit | 1.080 ns | 0.0876 ns | 0.1439 ns | 1.02 | 0.19 | - | - | NA |
ImmutableEdit | 8.282 ns | 0.2287 ns | 0.3353 ns | 7.79 | 1.03 | 0.0076 | 32 B | NA |
Even for the most basic edit operation, immutable copying is more than 7× slower, and (obviously) allocates memory the mutable edit doesn’t, which translates to more garbage-collection cost to pay later.
Let’s scale it up to a slightly more realistic immutable data structure.
```csharp
public class MyMutableParentObject
{
    public required ulong Id { get; set; }
    public required string Name { get; set; }
    public required MyMutableChildObject Child { get; set; }
}

public class MyMutableChildObject
{
    public required ulong Id { get; set; }
    public required string Name { get; set; }
    public required MyMutableGrandchildObject FirstGrandchild { get; set; }
    public required MyMutableGrandchildObject SecondGrandchild { get; set; }
    public required MyMutableGrandchildObject ThirdGrandchild { get; set; }
}

public class MyMutableGrandchildObject
{
    public required ulong Id { get; set; }
    public required string Name { get; set; }
}
```
```csharp
public record MyImmutableParentObject
{
    public required ulong Id { get; init; }
    public required string Name { get; init; }
    public required MyImmutableChildObject Child { get; init; }
}

public record MyImmutableChildObject
{
    public required ulong Id { get; init; }
    public required string Name { get; init; }
    public required MyImmutableGrandchildObject FirstGrandchild { get; init; }
    public required MyImmutableGrandchildObject SecondGrandchild { get; init; }
    public required MyImmutableGrandchildObject ThirdGrandchild { get; init; }
}

public record MyImmutableGrandchildObject
{
    public required ulong Id { get; init; }
    public required string Name { get; init; }
}
```
```csharp
_immutableTree = new()
{
    Id = 1,
    Name = "Parent",
    Child = new()
    {
        Id = 2,
        Name = "Child",
        FirstGrandchild = new()
        {
            Id = 3,
            Name = "First Grandchild"
        },
        SecondGrandchild = new()
        {
            Id = 4,
            Name = "Second Grandchild"
        },
        ThirdGrandchild = new()
        {
            Id = 5,
            Name = "Third Grandchild"
        },
    }
};
```
```csharp
_mutableTree = new()
{
    Id = 1,
    Name = "Parent",
    Child = new()
    {
        Id = 2,
        Name = "Child",
        FirstGrandchild = new()
        {
            Id = 3,
            Name = "First Grandchild"
        },
        SecondGrandchild = new()
        {
            Id = 4,
            Name = "Second Grandchild"
        },
        ThirdGrandchild = new()
        {
            Id = 5,
            Name = "Third Grandchild"
        },
    }
};
```
```csharp
[Benchmark(Baseline = true)]
public MyMutableParentObject MutableEdit()
{
    _mutableTree.Child.SecondGrandchild.Name = "Second Grandchild Edited";
    return _mutableTree;
}

[Benchmark]
public MyImmutableParentObject ImmutableEdit()
    => _immutableTree with
    {
        Child = _immutableTree.Child with
        {
            SecondGrandchild = _immutableTree.Child.SecondGrandchild with
            {
                Name = "Second Grandchild Edited"
            }
        }
    };
```
Method | Mean | Error | StdDev | Ratio | RatioSD | Gen0 | Allocated | Alloc Ratio |
---|---|---|---|---|---|---|---|---|
MutableEdit | 1.129 ns | 0.0840 ns | 0.0825 ns | 1.00 | 0.10 | - | - | NA |
ImmutableEdit | 32.685 ns | 0.8503 ns | 2.4534 ns | 29.09 | 2.95 | 0.0306 | 128 B | NA |
Not only is performance worse, it keeps degrading as you scale up your immutable structures: every object along the edited path (here the parent, the child, and the grandchild) has to be re-allocated, so the cost grows with the depth of the change, while the mutable edit stays a single field write.
Now, all this being said, I myself use the immutable object pattern FREQUENTLY, in both C# and JavaScript. There are a lot of problems you encounter in business logic that it solves really well, and it’s basically the ideal type of data structure for use in reactive programming, which is extremely effective for building GUIs. In other words, I use immutable objects a ton when I’m building out the business layer of a UI, where data is king. If I were writing code within any of the frameworks I use to BUILD those UIs (.NET, WPF, ReactiveExtensions), you can bet I’d be using immutable objects way more sparingly.
Recall elections are a thing, are they not?
A function call of “MyFunction(parameter: GLFW_TRUE)” is more readable than “MyFunction(parameter: 1)”. Not by much, mind you, but if given the choice between these two, one is clearly better. It requires no assumptions about what the reader may or may not already know about the system. It communicates intent without any ambiguity.
It’s fraud. They publicly claimed, point-blank, to do a certain thing for years, and were instead doing the opposite, in the interest of making more money. The affiliate link thing is only one of several points that they’re suing over. The far more egregious one is that they don’t actually “scour the internet to find you the best coupons.” They will actively hide better coupons that they know about, if marketplaces pay them to, and still tell you in the browser “this is the best coupon.”
What are you considering as “paint[ing] UI-elements” in this context? I don’t see anything I would describe as “painting” in the code snippets at those links.
Look into getting SmartTube for the TV. Works pretty great for me
The two models, […] each offer a minimum of 3TB per disk
Huh? The hell is this supposed to mean? Are they talking about the internal platters?
I meeeeeeean… this seems like a question they SHOULD be asking. No, it’s not sustainable as a for-profit business, which is why they should get out of the business and leave it to non-profits and state institutions.
What DOES the new scanner do with its scan output, then?
Image is broken?
He alluded to it in the first video, and I think it’s spot on.
They ended up with an “inventory problem.” Which is to say, some business major in the company somewhere, or a consultant or whatever, saw that they were spending money to store it all, and said “A company’s assets should never cost money, they should MAKE money” or some such business speak. Ultimately that translated into every layer of the business being instructed to prioritize using that old inventory, somehow, or pushing it to customers.
“People don’t really want to buy all this older hardware off of us, but we can convince people who don’t know any better to rent it.”
“We don’t have enough 4090s to keep up with demand for these high-end rentals, but we’re sure as hell not buying more when we have all these perfectly-good 4080s lying around.”
A little off-topic: anyone else read this as “BCA Chefs”, initially?
If only it were that easy to snap your fingers and magically transform your code base from C to Rust. Spoiler alert: It’s not.
How utterly disingenuous. That’s not what the CISA recommendation says, at all.
As if the new notepad wasn’t already enough of a downgrade.
I mean, the book of Revelations is indeed a prophecy.
This really reads to me like the perspective of a business major whose only concept of productivity is about what looks good on paper. He seems to think it’s a desirable goal for EVERY project to be completed with 0 latency. That’s absurd. If every single incoming requirement is a “top priority, this needs to go out as soon as possible” that’s a management failure. They either need to ACTUALLY prioritize requirements properly, or they need to bring in more people.
For the Chuck and Patty example, he describes Chuck finishing a task and sending it to Patty for review, and Patty not picking it up because she’s “busy.” Busy with what? If this task is the higher priority, why is she not switching to it as soon as it’s ready? Do either Chuck or Patty not know that this task is the current highest priority? Sounds like a management failure. Is there not a system in place (whether automatic or not) for notifying people when high-priority tasks are assigned? Also sounds like a management failure. Is Patty just incapable of switching tasks within 30-60 minutes? Then she needs to work on her organization skills, or management isn’t providing sufficient tooling for multitasking.
When a top-priority “this needs to go out ASAP” task is in play on my team, I’m either working on it, or I know it’s coming my way soon, and who it’s coming from, because my Project Lead has already coordinated that among all of us. Because that’s her job.
From the article…
> Project A should take around 2 weeks
> Project B should take around 2 weeks
> That’s 4 weeks to complete them both
> But only if they’re done in sequence!
> If you try to do them at the same time, with the same team, don’t be surprised if it ends up taking 6 weeks!
Nonsense. If these are both top priorities, the team has proper leadership, and the 2-week estimates are actually accurate, then 4 weeks is entirely achievable. If these are not top priorities, and the team has other work as well, then yeah, no shit, it might be 6 weeks. You can’t just ignore the 2 weeks from Project C if it’s prioritized similarly to A and B. If A and B NEED to go out in 4 weeks, then prioritize them higher, and coordinate your team to make that happen.
Yeah, I was thinking the same thing, when I first heard this story. He’s not saying he hopes Mehdi dies, he made an extremely cavalier joke about him dying. Just as bad, but, y’know, not as good of a sound bite.
You’ve never met an average ASP.NET developer?
Props to whoever this company is. This is one of the best bits of customer service I’ve seen in years.