One of the biggest issues I have come across in React is that you have to make copies of objects and arrays (as mentioned in this article) when calling setState(). Cloning works well for small arrays and objects (I use the code below for that), but for large arrays with thousands of items (a datagrid, for example) this is inefficient. My solution is to modify the array/object directly and then just call setState({}). Does anyone have a better solution?
```typescript
export function deepClone<T>(obj: T): T {
  // Primitives, null, and undefined are returned as-is.
  if (obj == null || typeof obj !== 'object') return obj;
  // Note: this handles plain objects and arrays, but not Dates, Maps,
  // Sets, or cyclic references.
  const clone = new (obj as any).constructor();
  for (const key in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, key)) {
      clone[key] = deepClone(obj[key]);
    }
  }
  return clone;
}
```
Deep cloning is almost never the right answer - it replaces _every_ object reference with a new one, which is extra work, and because nothing keeps its identity it can also trigger unnecessary re-renders in components that rely on reference equality.
A correct immutable update is more like a "nested shallow update", copying all levels of nested keypaths leading to the actual value you want to update.
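For example, here is a minimal sketch of such a nested shallow update written by hand (the `State` shape is hypothetical):

```typescript
// Hypothetical state shape for illustration.
interface State {
  user: { profile: { name: string; age: number }; settings: { theme: string } };
  posts: string[];
}

const state: State = {
  user: { profile: { name: 'Ann', age: 30 }, settings: { theme: 'dark' } },
  posts: ['hello'],
};

// "Nested shallow update": copy only the objects along the path
// state -> user -> profile, reusing everything else by reference.
const next: State = {
  ...state,
  user: {
    ...state.user,
    profile: { ...state.user.profile, age: 31 },
  },
};
```

`next.user.settings` and `next.posts` are the same objects as before, so components rendering them can bail out of re-rendering.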
If you prefer to avoid writing those immutable operations by hand, Immer is a fantastic tool for simplifying the code by using "mutating" syntax that gets turned into a safe and correct immutable update:
immer makes an "efficient copy" of the object, right? Keeping references to some nested objects in the original object when possible? If so that's cool... but not sure it is worth the effort.
Immer does a correct nested immutable update, same as if you wrote the corresponding nested spread operations by hand. It _only_ updates the nesting paths that led to the value that got modified.
So yes, if you have a nested object with a bunch of different fields, and you update `state.a.b.c = 123`, it makes copies of `b`, `a`, and `state`, but preserves all other nested references that are anywhere inside of `state`.
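To make that concrete, here is that exact update written out with spreads (which, per the comment above, is equivalent to what Immer does), with checks of which references survive:

```typescript
const state = {
  a: { b: { c: 1 }, d: { keep: true } },
  e: { alsoKeep: true },
};

// Updating state.a.b.c copies b, a, and state...
const next = { ...state, a: { ...state.a, b: { ...state.a.b, c: 123 } } };

// ...but every object off that path keeps its identity:
console.log(next.a.d === state.a.d); // true
console.log(next.e === state.e);     // true
console.log(next.a === state.a);     // false: a is on the update path
```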
Others have mentioned efficient ways to work with immutable data structures, but I want to mention another solution:
In cases where I want to get ideal performance, I sometimes separate out a plain JavaScript service and wrap it in a React hook.
So for example, let’s say you have a massive array of 100,000 objects. You presumably aren’t putting that entire array on the screen at once. So you could keep the array in a normal JavaScript class, keep an instance of that class in state, fire events into it, and then query out the slices of data you actually need.
Those smaller slices can be generated each time there is a state change, but you don’t need to re-allocate the big array.
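A rough sketch of that pattern, assuming a hypothetical `RowStore` class (the names and event shape are made up):

```typescript
// A plain class owns the big array; React only ever sees small slices.
type Row = { id: number; value: string };

class RowStore {
  private rows: Row[];
  private listeners = new Set<() => void>();

  constructor(rows: Row[]) {
    this.rows = rows;
  }

  // Mutate in place: the big array is never re-allocated.
  updateRow(index: number, value: string) {
    const row = this.rows[index];
    if (row) {
      row.value = value;
      this.listeners.forEach((l) => l());
    }
  }

  // Query out just the slice a component actually renders.
  getPage(start: number, count: number): Row[] {
    return this.rows.slice(start, start + count);
  }

  subscribe(listener: () => void): () => void {
    this.listeners.add(listener);
    return () => this.listeners.delete(listener);
  }
}
```

A hook could then wrap this with React's `useSyncExternalStore(store.subscribe, getSnapshot)`, taking care that the snapshot function returns a cached slice so its identity stays stable between events.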
Even if you do need to show the entire array, say as a data plot, you can render directly to a canvas, but still wrap that canvas and the I/O in a React hook. React works quite well with these “plain JavaScript” escape hatches in the rare case they’re needed.
The other place I’ve been using this approach is for drag and drop hooks, where the complexity of managing React renders and coordinating state changes on every mouse move is just too much. Instead I have a service class that updates the few DOM nodes that need to change each frame, and I only fire off state changes when there are meaningful transitions (hovering over something, dropping something, etc.).
This is what the "immutability-helper" library and Immer are for. Or Immutable.js if you want to go fancier.
It's not that expensive. You don't need to deep clone an array to change one element: you just create a new array that references all the old elements, except the one you're changing, which is swapped for the new version. If you understand referential equality, this will make a lot more sense.
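For instance, replacing one element of an array while sharing the rest:

```typescript
const rows = [{ v: 1 }, { v: 2 }, { v: 3 }];

// New array, same element references, except index 1 gets a new object.
const next = rows.map((row, i) => (i === 1 ? { ...row, v: 20 } : row));

console.log(next === rows);       // false: the array itself is new
console.log(next[0] === rows[0]); // true: untouched elements are shared
console.log(next[1].v);           // 20
```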
You only need to update the objects that actually changed, and only along the path of the keys that changed. Deep cloning is overkill because you'll incur way too many updates.
Check out useReducer and all your problems go away! (It has no relation to Redux; many people assume it does, but it's a native React hook that makes updating keys within an object much easier.)
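Since a reducer is just a pure function, it can be sketched and even tested without React (the action names here are made up):

```typescript
type State = { filter: string; page: number };
type Action =
  | { type: 'setFilter'; filter: string }
  | { type: 'nextPage' };

// All update logic lives in one pure function, returning new state
// immutably instead of mutating the old one.
function reducer(state: State, action: Action): State {
  switch (action.type) {
    case 'setFilter':
      // Changing the filter resets pagination.
      return { ...state, filter: action.filter, page: 0 };
    case 'nextPage':
      return { ...state, page: state.page + 1 };
  }
}
```

In a component: `const [state, dispatch] = useReducer(reducer, { filter: '', page: 0 })`, then `dispatch({ type: 'nextPage' })`.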