
One of the biggest issues in React that I have come across is that you have to make copies of objects and arrays (as mentioned in this article) when calling setState(). Cloning works well for small arrays and objects (I use the code below for that), but for large arrays with thousands of items (for example, a datagrid scenario) this is inefficient. My solution is to modify the array/object directly and then just call setState({}). Does anyone have a better solution?

   export function deepClone<T>(obj: T): T {
       if (obj == null || typeof obj !== 'object')
           return obj;

       if (Array.isArray(obj))
           return obj.map(deepClone) as unknown as T;

       // Assumes plain objects; Date, Map, Set, etc. are not handled
       // specially and would lose their internal state.
       const clone = new (obj as any).constructor();

       for (const key in obj) {
           if (Object.prototype.hasOwnProperty.call(obj, key))
               clone[key] = deepClone(obj[key]);
       }

       return clone;
   }


Deep cloning is almost never the right answer - you'll end up replacing _too many_ object references, which is extra work, and that can also lead to unnecessary re-renders depending on what your components are trying to do.

A correct immutable update is more like a "nested shallow update", copying all levels of nested keypaths leading to the actual value you want to update.

If you prefer to avoid writing those immutable operations by hand, Immer is a fantastic tool for simplifying the code by using "mutating" syntax that gets turned into a safe and correct immutable update:

- https://immerjs.github.io/immer/

- https://beta.reactjs.org/learn/updating-objects-in-state#wri...

We even use Immer by default in Redux Toolkit:

- https://redux-toolkit.js.org/usage/immer-reducers


immer makes an "efficient copy" of the object, right? Keeping references to some nested objects in the original object when possible? If so that's cool... but not sure it is worth the effort.


Immer does a correct nested immutable update, same as if you wrote the corresponding nested spread operations by hand. It _only_ updates the nesting paths that led to the value that got modified.

So yes, if you have a nested object with a bunch of different fields, and you update `state.a.b.c = 123`, it makes copies of `b`, `a`, and `state`, but preserves all other nested references that are anywhere inside of `state`.

It's the same as if you wrote:

    return {
      ...state,
      a: {
        ...state.a,
        b: {
          ...state.a.b,
          c: 123
        }
      }
    }
but obviously much shorter and easier to read / maintain :)

It also conveniently eliminates the chance of a _real_ accidental mutation (which in the case of Redux was always the #1 cause of bugs in Redux apps).
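A quick way to convince yourself of that reference-preservation behavior, in plain TypeScript with no Immer involved (the `state` shape here is made up purely for illustration):

```typescript
// Hypothetical nested state; only the a.b.c path gets updated.
const state = {
    a: { b: { c: 1 }, sibling: { x: 'kept' } },
    other: { y: 'kept' },
};

const next = {
    ...state,
    a: {
        ...state.a,
        b: {
            ...state.a.b,
            c: 123,
        },
    },
};

// Objects along the updated path are new copies...
console.log(next !== state);          // true
console.log(next.a !== state.a);      // true
console.log(next.a.b !== state.a.b);  // true

// ...but everything off the path keeps its original identity,
// so memoized components that only read these don't re-render.
console.log(next.a.sibling === state.a.sibling); // true
console.log(next.other === state.other);         // true
```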


That's awesome! It lets me use POJO (Plain Old JavaScript Objects), and it makes efficient copies? Definitely going to give this a try.


Others have mentioned efficient ways to work with immutable data structures, but I want to mention another solution:

In cases where I want to get ideal performance, I sometimes separate out a plain JavaScript service and wrap it in a React hook.

So for example, let’s say you have a massive array of like 100,000 objects. You presumably aren’t putting that entire array on the screen at once. So you could keep the array in a normal JavaScript class, keep an instance of that class in state, fire events into it, and then query out the slices of data you actually need.

Those smaller slices can be generated each time there is a state change, but you don’t need to re-allocate the big array.
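A minimal sketch of that pattern (the class name and methods here are invented for illustration; the React hook wrapping is shown only in comments since it's the uninteresting part):

```typescript
// A plain JS service that owns the big array; React never iterates it.
class BigListService {
    private items: { id: number; label: string }[] = [];
    private version = 0; // bumped on every mutation

    load(items: { id: number; label: string }[]): void {
        this.items = items;
        this.version++;
    }

    updateLabel(id: number, label: string): void {
        const item = this.items.find(i => i.id === id);
        if (item) {
            item.label = label; // mutate in place, no re-allocation
            this.version++;
        }
    }

    // Query out only the slice a component actually renders.
    slice(start: number, end: number): { id: number; label: string }[] {
        return this.items.slice(start, end);
    }

    getVersion(): number {
        return this.version;
    }
}

// In a component you'd wrap this in a hook, roughly:
//   const service = useRef(new BigListService()).current;
//   const [, setVersion] = useState(0);
//   const update = (id: number, label: string) => {
//       service.updateLabel(id, label);
//       setVersion(service.getVersion()); // cheap state change triggers render
//   };
//   const visible = service.slice(first, first + pageSize);
```

The version counter is just one way to signal "something changed" to React without copying the underlying data.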

Even if you do need to show the entire array, say as a data plot, you can render directly to a canvas, but still wrap that canvas and the I/O in a React hook. React works quite well with these “plain JavaScript” escape hatches in the rare case they’re needed.

The other place I’ve been using this approach is for drag and drop hooks, where the complexity of managing React renders and coordinating state changes on every mouse move is just too much. Instead I have service class that updates the few DOM nodes that need to be changed every frame, and I only fire off state changes when there are meaningful transitions (hover over something, drop something, etc.)


This is what the "immutability-helper" library and Immer are used for. Or Immutable.js if you want to go even fancier.

It's not that expensive. You don't need to deep clone an array to change one element; you just create a new array that points at all the old elements, swapping in a new version of the one you changed. If you understand "referential equality", this will make a lot more sense.
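In code, that's a single shallow copy, roughly like this (plain TypeScript; `Item` and `replaceAt` are placeholder names, not part of any of those libraries):

```typescript
type Item = { id: number; value: number };

// Replace one element without deep cloning anything: the new array
// reuses every old element reference except the one being swapped.
function replaceAt(arr: readonly Item[], index: number, next: Item): Item[] {
    return arr.map((item, i) => (i === index ? next : item));
}

const items: Item[] = [{ id: 1, value: 10 }, { id: 2, value: 20 }];
const updated = replaceAt(items, 1, { id: 2, value: 99 });

console.log(updated !== items);        // true: new array identity
console.log(updated[0] === items[0]);  // true: untouched element reused
```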


Try going all in on persistent data structures. I haven't tried this one but perhaps it will do the trick: https://github.com/cronokirby/persistent-ts


I prefer to use POJO (plain old JavaScript objects).


You only need to update the objects that actually changed, and only along the path of the keys that changed. Deep cloning is overkill because you'll incur way too many updates.

For example:

    const [largeArray, setLargeArray] = useState<MyObj[]>([]);
    const updateObject = (index: number) => (newObj: MyObj) => {
        setLargeArray(prevArray => ([
            ...prevArray.slice(0, index),
            newObj,
            ...prevArray.slice(index+1),
        ]));
    }


And if you don't want to write it yourself, use a library.


Check out useReducer and all your problems go away! (It has no relation to Redux; many people assume it does, but it's a native React hook that makes updating keys within an object much easier.)


Just read the docs for useReducer. It says you should not change the original object. You have to make a copy.
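Right - the reducer itself is just a plain function, and it still has to return a new object. A minimal sketch (state shape and action names invented for illustration):

```typescript
type State = { filter: string; selectedId: number | null };

type Action =
    | { type: 'setFilter'; filter: string }
    | { type: 'select'; id: number };

// useReducer doesn't remove the immutability requirement;
// it just centralizes all the copy-on-update logic in one place.
function reducer(state: State, action: Action): State {
    switch (action.type) {
        case 'setFilter':
            return { ...state, filter: action.filter };
        case 'select':
            return { ...state, selectedId: action.id };
        default:
            return state;
    }
}

// In a component: const [state, dispatch] = useReducer(reducer, initialState);
```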


Yes, that's the best way. Fortunately, there are relatively few components that need a hack like that.

PS Why did you post a `deepClone` function? You certainly don't need to deep clone.


deepClone is convenient, when you want to modify a field that's deep inside the object.


You should only be cloning the modified object, not every single object in the array. That's what's making things slow!


Yep mystery solved



