
Zap Those Dupes: The Ultimate Guide to Removing Duplicates from Arrays in JavaScript

Hey there, fellow code wranglers! We’ve all been there, staring down a wild array chock-full of duplicates, thinking, “How the heck do I tame this beast?” Well, buckle up because I’m about to show you how to zap those pesky dupes into oblivion using pure JavaScript goodness.

The Classic: filter and indexOf

Let’s kick things off with a classic move, the ol’ filter and indexOf combo. This approach is like the trusty hammer in your toolbox — not the fanciest, but it gets the job done.

const uniqueArray = yourArray.filter((item, index, self) => {
  return self.indexOf(item) === index;
});

Here’s the play-by-play: we’re filtering through yourArray, and for each item we check whether indexOf(item) matches the current index. Since indexOf always returns the position of the first occurrence, the check only passes the first time a value shows up, which means it’s not a duplicate and we keep it. Every later copy gets the boot.
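
Here’s a quick spin with some made-up numbers so you can see it earn its keep:

const yourArray = [1, 2, 2, 3, 3, 1];

const uniqueArray = yourArray.filter((item, index, self) => {
  return self.indexOf(item) === index; // true only for the first occurrence
});

console.log(uniqueArray); // [1, 2, 3]

Fair warning: indexOf rescans the array for every single element, so this combo is roughly O(n²). No big deal for small arrays, but keep it in mind when things get huge (more on that in the performance section).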

Set It and Forget It: Using Set

If you’re looking to level up and use some of the newer JavaScript features, then Set is your new best friend. Set is like the bouncer at the club door, only letting in the unique values.

const uniqueArray = [...new Set(yourArray)];

Short, sweet, and to the point. We create a Set from yourArray, which automatically weeds out the duplicates. Then we spread it back into an array because, well, we’re not savages.
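
One nuance worth knowing: Set compares values with SameValueZero equality. That means NaN gets deduped properly (the filter/indexOf combo actually drops NaN entirely, since indexOf can never find it), while objects are compared by reference, so two lookalike object literals both make it past the bouncer. A quick illustration with some throwaway data:

const messyArray = [1, 1, NaN, NaN, { id: 1 }, { id: 1 }];

const uniqueArray = [...new Set(messyArray)];

console.log(uniqueArray); // [1, NaN, { id: 1 }, { id: 1 }] (objects are compared by reference)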

Reduce It to Uniqueness: reduce Method

For those who like to get a little fancy with their array manipulation, reduce is like the Swiss Army knife of array methods. It’s a bit more verbose, but hey, sometimes you gotta show off those coding chops.

const uniqueArray = yourArray.reduce((accumulator, item) => {
  return accumulator.includes(item) ? accumulator : [...accumulator, item];
}, []);

We’re building up a new array from scratch with reduce. For each item, we check if it’s already in our accumulator array. If it’s not, we add it. If it is, we just return the accumulator as-is, effectively skipping the duplicate.
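
Here’s a tiny run with some throwaway letters, with the accumulator traced in a comment:

const yourArray = ['a', 'b', 'a', 'c', 'b'];

const uniqueArray = yourArray.reduce((accumulator, item) => {
  return accumulator.includes(item) ? accumulator : [...accumulator, item];
}, []);

// accumulator over the run: [] -> ['a'] -> ['a', 'b'] -> ['a', 'b'] -> ['a', 'b', 'c'] -> ['a', 'b', 'c']
console.log(uniqueArray); // ['a', 'b', 'c']

Heads up: includes rescans the accumulator and the spread copies it on every addition, so this one is more about style points than raw speed.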

The Higher-Order Function: uniq with Lodash

Okay, so this isn’t pure JavaScript, but sometimes you need to stand on the shoulders of giants, and Lodash is a giant when it comes to utility functions. Using its uniq function is like hiring a professional organizer to declutter your array.

const uniqueArray = _.uniq(yourArray);

Yup, that’s it. Just call _.uniq and hand it your array. Lodash takes it from there and hands you back an array free of duplicates. It’s almost like cheating, but in a good way.
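
Assuming you’ve got Lodash installed (npm install lodash), a typical setup looks like this; the sample array is just for show, and importing only uniq keeps your bundle from ballooning:

// npm install lodash
import uniq from 'lodash/uniq';

const yourArray = [1, 2, 2, 3, 3];
const uniqueArray = uniq(yourArray);

console.log(uniqueArray); // [1, 2, 3]

Bonus: Lodash also ships uniqBy, which handles the dedupe-objects-by-property case we’ll get to later, e.g. _.uniqBy(users, 'id').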

For the Performance Junkies: Set and forEach

If you’re the type who obsesses over performance and can’t stand the thought of unnecessary iterations, this one’s for you. It’s a bit more manual, but it makes a single pass with near-constant-time Set lookups, so on large datasets it leaves the quadratic filter/indexOf and reduce/includes approaches in the dust.

const uniqueArray = [];
const uniqueSet = new Set();

yourArray.forEach(item => {
  if (!uniqueSet.has(item)) {
    uniqueSet.add(item);
    uniqueArray.push(item);
  }
});

We’re using a Set to track uniqueness, but instead of converting the whole thing back to an array, we’re building the array as we go. Each item only gets added if it’s not already in the Set.
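
If you plan to reach for this more than once, it tucks nicely into a little helper. A minimal sketch (the name dedupe is just a suggestion):

const dedupe = (items) => {
  const seen = new Set();
  const result = [];
  for (const item of items) {
    if (!seen.has(item)) {
      seen.add(item);    // remember we've met this value
      result.push(item); // keep the first occurrence only
    }
  }
  return result;
};

console.log(dedupe([3, 1, 3, 2, 1])); // [3, 1, 2]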


Alright, code slingers, that’s the first half of our journey into de-duplication. We’ve covered the basics, the fancy, the even fancier, and the performance-centric methods of removing duplicates from an array in JavaScript. Stay tuned for the second half, where we’ll dive into even more methods, edge cases, and performance considerations. Keep your keyboards at the ready!

Dive Deeper: Edge Cases and Object Uniqueness

Welcome back, code warriors! Now that we’ve conquered the basics, let’s tackle some trickier scenarios. What if our array is a jumbled mess of objects or mixed types? Fear not, for we have the tools to handle these with grace.

Object Uniqueness by Property

Arrays of objects are like a crowd of people — each with their own characteristics. To filter out duplicates based on a specific property, we’ll need to roll our own with a small custom function.

const uniqueByProperty = (arr, prop) => {
  const unique = new Map();
  arr.forEach((item) => {
    if (!unique.has(item[prop])) {
      unique.set(item[prop], item);
    }
  });
  return Array.from(unique.values());
};

const arrayOfObjects = [{ id: 1 }, { id: 2 }, { id: 1 }];
const uniqueObjects = uniqueByProperty(arrayOfObjects, 'id');

In this snippet, we’re using a Map to keep track of unique objects based on their id property: the has check means the first object we see for each id wins, and any later duplicates are ignored. The Map object is a collection of key-value pairs with fast lookup times, making it perfect for this job.
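
If you like your utilities terse, the same Map idea compresses into a one-liner by handing the constructor ready-made key-value pairs. One behavioral twist, though: later entries overwrite earlier ones, so this version keeps the last object for each key instead of the first. A sketch, with the helper name being just a suggestion:

const uniqueByPropertyLastWins = (arr, prop) =>
  Array.from(new Map(arr.map((item) => [item[prop], item])).values());

const records = [{ id: 1, note: 'first' }, { id: 2 }, { id: 1, note: 'last' }];

console.log(uniqueByPropertyLastWins(records, 'id'));
// [{ id: 1, note: 'last' }, { id: 2 }]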

Mixed Type Deduplication

Sometimes you’re dealing with an array that’s like a party mix — a little bit of everything. To handle mixed types, we need to be a bit more explicit about how we decide two items count as the same.

const uniqueMixedArray = yourArray.reduce((accumulator, item) => {
  const itemIdentifier = typeof item + JSON.stringify(item);
  return accumulator.some((elem) => {
    return typeof elem + JSON.stringify(elem) === itemIdentifier;
  })
    ? accumulator
    : [...accumulator, item];
}, []);

Here, we’re using a combination of typeof and JSON.stringify to create a unique identifier for each item, regardless of its type. This way, we can ensure that 1 (a number) and '1' (a string) are treated as distinct values.
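
If you need this trick more than once, it drops nicely into a helper, using a Set of identifiers so each item only gets stringified once instead of being re-checked against the whole accumulator. A sketch (helper name is, as usual, just a suggestion):

const dedupeMixed = (arr) => {
  const seen = new Set();
  return arr.filter((item) => {
    const identifier = typeof item + JSON.stringify(item); // same identifier trick as above
    if (seen.has(identifier)) {
      return false; // already met this exact type + value combo
    }
    seen.add(identifier);
    return true;
  });
};

console.log(dedupeMixed([1, '1', 1, '1', true, 'true'])); // [1, '1', true, 'true']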

Performance Considerations

When it comes to big data sets, performance can take a hit if we’re not careful. Let’s talk about some best practices to keep your code running faster than a cheetah on a sugar rush.

  • Use Set when possible: As we’ve seen, Set is a high-performance way to ensure uniqueness, especially when you’re dealing with primitive values.
  • Avoid unnecessary iterations: Methods like filter and indexOf can be costly on large arrays because indexOf rescans the array for every element, which lands you in O(n²) territory. Where efficiency is key, aim for a single pass.
  • Consider space complexity: While creating new arrays or sets, keep an eye on memory usage. Sometimes, in-place modifications (though they have their downsides) can save on space.
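
On that last point, here’s a minimal in-place sketch, assuming an array of primitives that you’re allowed to mutate. It still keeps a Set of seen values, but it reuses the original array instead of allocating a fresh one:

const dedupeInPlace = (arr) => {
  const seen = new Set();
  let writeIndex = 0;
  for (const item of arr) {
    if (!seen.has(item)) {
      seen.add(item);
      arr[writeIndex++] = item; // compact unique values toward the front
    }
  }
  arr.length = writeIndex; // chop off the leftovers
  return arr;
};

const numbers = [5, 5, 4, 4, 3];
dedupeInPlace(numbers);
console.log(numbers); // [5, 4, 3]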

Going Beyond Arrays: ES6 Maps for Unique Keys

Sometimes, you’re not just dealing with arrays — you’re juggling maps, too. ES6 Map objects can be a powerful ally when you need unique keys.

const uniqueMap = new Map();
yourArray.forEach((item) => {
  uniqueMap.set(item.id, item);
});
const uniqueArrayFromMap = Array.from(uniqueMap.values());

In this example, we’re using the unique key constraint of Map to our advantage. By setting the id as the key, we ensure that each entry remains unique.
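
One detail worth spelling out: Map.set overwrites existing keys, so when two items share an id, the later one wins (the opposite of the first-wins uniqueByProperty helper from earlier). Here it is on some invented data:

const yourArray = [
  { id: 1, name: 'first' },
  { id: 2, name: 'only' },
  { id: 1, name: 'last' },
];

const uniqueMap = new Map();
yourArray.forEach((item) => {
  uniqueMap.set(item.id, item); // same id? the newer item replaces the older one
});

console.log(Array.from(uniqueMap.values()));
// [{ id: 1, name: 'last' }, { id: 2, name: 'only' }]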

Wrapping Up

And there you have it, folks — a comprehensive guide to removing duplicates from arrays in JavaScript, covering everything from the basics to the nitty-gritty details. Whether you’re dealing with simple lists or complex objects, there’s a strategy to fit your needs.

Remember, the key to mastering JavaScript is understanding the tools at your disposal and knowing when to use them. With this guide in hand, you’re well-equipped to tackle any duplicate-related challenges that come your way.

So go forth, and may your arrays always be as unique as your coding style! Keep crafting those beautiful lines of code, and never stop learning. Happy coding!