Assuming I have the following:
var array =
[
{"name":"Joe", "age":17},
{"name":"Bob", "age":17},
{"name":"Carl", "age": 35}
]
What is the best way to get an array of all of the distinct ages, such that I get a result array of:
[17, 35]
Is there some way I could alternatively structure the data, or use a better method, so that I would not have to iterate through the array checking each "age" value for existence in another array and adding it if absent?
If there was some way I could just pull out the distinct ages without iterating...
This is the current inefficient way I would like to improve. If it means that "array" becomes a map of objects with some unique key (i.e. "1", "2", "3") instead of an array of objects, that would be okay too. I'm just looking for the most performance-efficient way.
The following is how I currently do it, but for me, iteration appears to be crummy for efficiency even though it does work...
var distinct = [];
for (var i = 0; i < array.length; i++)
    if (distinct.indexOf(array[i].age) === -1)
        distinct.push(array[i].age);
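Edit: one alternative structure I considered is maintaining a Set of the ages as records are added, so no scan is needed later; a rough sketch:

```javascript
// Keep a Set of ages in sync with the records array, so the
// distinct ages are always available without rescanning.
const records = [];
const ages = new Set();

function addRecord(record) {
  records.push(record);
  ages.add(record.age);
}

addRecord({ name: "Joe", age: 17 });
addRecord({ name: "Bob", age: 17 });
addRecord({ name: "Carl", age: 35 });

console.log([...ages]); // [17, 35]
```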
Set objects and maps are wasteful. This job just takes a simple .reduce() stage.
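For example, a minimal sketch of that reduce-only approach, using the question's data:

```javascript
const array = [
  { name: "Joe", age: 17 },
  { name: "Bob", age: 17 },
  { name: "Carl", age: 35 }
];

// Accumulate each age only if it has not been seen yet.
const distinct = array.reduce(
  (acc, item) => (acc.includes(item.age) ? acc : acc.concat(item.age)),
  []
);

console.log(distinct); // [17, 35]
```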
If you are using ES6/ES2015 or later you can do it this way:
const data = [
{ group: 'A', name: 'SD' },
{ group: 'B', name: 'FI' },
{ group: 'A', name: 'MM' },
{ group: 'B', name: 'CO'}
];
const unique = [...new Set(data.map(item => item.group))]; // [ 'A', 'B']
Here is an example of how to do it.
For those who want to return an object with all of its properties, unique by key:
const array = [
    { "name": "Joe", "age": 17 },
    { "name": "Bob", "age": 17 },
    { "name": "Carl", "age": 35 }
];
const key = 'age';
const arrayUniqueByKey = [...new Map(array.map(item => [item[key], item])).values()];
console.log(arrayUniqueByKey);
/* OUTPUT
[
    { "name": "Bob", "age": 17 },
    { "name": "Carl", "age": 35 }
]
*/
// Note: this will pick the last duplicated item in the list.
Note that this keeps the Bob object and drops the Joe object. After some research, it took a few iterations to get what I wanted; maybe there is a one-liner chaining function that I don't know of.
Add .filter(Boolean) before the map call. If any of the objects in the array are null, this will throw "Cannot read properties of null (reading 'id')".
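A sketch of that suggestion, with a hypothetical array containing a null entry and an id property:

```javascript
// Hypothetical data where one entry is null.
const yourArray = [{ id: 1 }, null, { id: 2 }, { id: 1 }];

// filter(Boolean) drops the null before map touches it.
const ids = Array.from(new Set(yourArray.filter(Boolean).map(item => item.id)));

console.log(ids); // [1, 2]
```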
using ES6
let array = [
{ "name": "Joe", "age": 17 },
{ "name": "Bob", "age": 17 },
{ "name": "Carl", "age": 35 }
];
array.map(item => item.age)
.filter((value, index, self) => self.indexOf(value) === index)
> [17, 35]
Or, to keep the whole objects instead of just the ages:
array.filter((value, index, self) => self.map(x => x.age).indexOf(value.age) == index)
Using ES6 features, you could do something like:
const uniqueAges = [...new Set( array.map(obj => obj.age)) ];
const uniqueObjects = [...new Set(array.map(obj => obj.age))]
    .map(age => array.find(obj => obj.age === age));
If this were PHP I'd build an array with the keys and take array_keys
at the end, but JS has no such luxury. Instead, try this:
var flags = [], output = [], l = array.length, i;
for( i=0; i<l; i++) {
if( flags[array[i].age]) continue;
flags[array[i].age] = true;
output.push(array[i].age);
}
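For what it's worth, JS can get fairly close to the PHP pattern with Object.keys (keys come back as strings, so convert them if you need numbers); a sketch:

```javascript
var array = [
  { name: "Joe", age: 17 },
  { name: "Bob", age: 17 },
  { name: "Carl", age: 35 }
];

// Use an object as the "PHP array": duplicate keys collapse automatically.
var seen = {};
for (var i = 0; i < array.length; i++) {
  seen[array[i].age] = true;
}

// Object.keys returns strings ("17", "35"), so map back to numbers.
var distinctAges = Object.keys(seen).map(Number);
console.log(distinctAges); // [17, 35]
```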
array_unique would compare the entire item, not just the age, as is asked here.
flags = {} is better than flags = []. (Though age is a relatively small integer, surely under 120, so a sparse array works here.)
You could use a dictionary approach like this one. Basically you assign the value you want to be distinct as a key in the "dictionary" (here we use an array as an object to avoid dictionary-mode). If the key did not exist then you add that value as distinct.
Here is a working demo:
var array = [{"name":"Joe", "age":17}, {"name":"Bob", "age":17}, {"name":"Carl", "age": 35}];
var unique = [];
var distinct = [];
for (let i = 0; i < array.length; i++) {
    if (!unique[array[i].age]) {
        distinct.push(array[i].age);
        unique[array[i].age] = 1;
    }
}
var d = document.getElementById("d");
d.innerHTML = "" + distinct;
This will be O(n), where n is the number of objects in the array. There is no faster way than O(n), because you must inspect each value at least once.
The previous version of this used an object and for...in. The changes were minor in nature and have since been applied above. However, the reason for a seeming performance advantage between the two versions in the original jsperf was the data sample size being so small; the main comparison in the previous version was between the internal map and filter use and the dictionary-mode lookups.
I have updated the code above, as noted, and I have also updated the jsperf to look through 1000 objects instead of 3. Using only 3 overlooked many of the performance pitfalls involved (obsolete jsperf).
Performance
https://jsperf.com/filter-vs-dictionary-more-data (when I ran this, the dictionary approach was 96% faster).
Screenshot of the results: https://i.stack.imgur.com/HZTBu.png
if (typeof(unique[array[i].age]) == "undefined") {
    distinct.push(array[i].age);
    unique[array[i].age] = 0;
}
This is how you would solve it using new Set via ES6, in TypeScript, as of August 25th, 2017:
Array.from(new Set(yourArray.map((item: any) => item.id)))
Array.from(new Set(yourArray.map((item) => item.id)))
I'd just map and remove dups:
var ages = array.map(function(obj) { return obj.age; });
ages = ages.filter(function(v,i) { return ages.indexOf(v) == i; });
console.log(ages); //=> [17, 35]
Edit: Aight! Not the most efficient way in terms of performance, but the simplest most readable IMO. If you really care about micro-optimization or you have huge amounts of data then a regular for
loop is going to be more "efficient".
A regular for loop with some ifs, that is; you'll get some very different results with three million items.
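A rough micro-benchmark sketch along those lines (timings vary a lot by engine; the data here is made up, 100,000 objects with 100 distinct ages):

```javascript
// Build a large hypothetical array: 100,000 objects, 100 distinct ages.
const big = Array.from({ length: 100000 }, (_, i) => ({ age: i % 100 }));

console.time("filter/indexOf");
const a = big.map(o => o.age).filter((v, i, s) => s.indexOf(v) === i);
console.timeEnd("filter/indexOf");

console.time("Set");
const b = [...new Set(big.map(o => o.age))];
console.timeEnd("Set");

// Both produce the same 100 unique ages; only the timings differ.
```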
var unique = array
.map(p => p.age)
.filter((age, index, arr) => arr.indexOf(age) == index)
.sort(); // sorting is optional
// or in ES6
var unique = [...new Set(array.map(p => p.age))];
// or with lodash
var unique = _.uniq(_.map(array, 'age'));
ES6 example
const data = [
{ name: "Joe", age: 17},
{ name: "Bob", age: 17},
{ name: "Carl", age: 35}
];
const arr = data.map(p => p.age); // [17, 17, 35]
const s = new Set(arr); // {17, 35} a set removes duplications, but it's still a set
const unique = [...s]; // [17, 35] Use the spread operator to transform a set into an Array
// or use Array.from to transform a set into an array
const unique2 = Array.from(s); // [17, 35]
There are many valid answers already, but I wanted to add one that uses only the reduce()
method because it is clean and simple.
function uniqueBy(arr, prop){
return arr.reduce((a, d) => {
if (!a.includes(d[prop])) { a.push(d[prop]); }
return a;
}, []);
}
Use it like this:
var array = [
{"name": "Joe", "age": 17},
{"name": "Bob", "age": 17},
{"name": "Carl", "age": 35}
];
var ages = uniqueBy(array, "age");
console.log(ages); // [17, 35]
const array = [
    {"id":"93","name":"CVAM_NGP_KW"},
    {"id":"94","name":"CVAM_NGP_PB"},
    {"id":"93","name":"CVAM_NGP_KW"},
    {"id":"94","name":"CVAM_NGP_PB"}
];

function uniq(array, field) {
    return array.reduce((accumulator, current) => {
        if (!accumulator.includes(current[field])) {
            accumulator.push(current[field]);
        }
        return accumulator;
    }, []);
}

const ids = uniq(array, 'id');
console.log(ids);
/* output
["93", "94"]
*/
This is a slight variation on the ES6 version if you need the entire object:
let arr = [
{"name":"Joe", "age":17},
{"name":"Bob", "age":17},
{"name":"Carl", "age": 35}
]
arr.filter((a, i) => arr.findIndex((s) => a.age === s.age) === i) // [{"name":"Joe", "age":17}, {"name":"Carl", "age": 35}]
I have a small solution
let data = [{id: 1}, {id: 2}, {id: 3}, {id: 2}, {id: 3}];
let result = data.filter((value, index, self) => self.findIndex((m) => m.id === value.id) === index);
The forEach version of @travis-j's answer (helpful on modern browsers and in the Node.js world):
var unique = {};
var distinct = [];
array.forEach(function (x) {
if (!unique[x.age]) {
distinct.push(x.age);
unique[x.age] = true;
}
});
34% faster on Chrome v29.0.1547: http://jsperf.com/filter-versus-dictionary/3
And a generic solution that takes a mapper function (tad slower than direct map, but that's expected):
function uniqueBy(arr, fn) {
var unique = {};
var distinct = [];
arr.forEach(function (x) {
var key = fn(x);
if (!unique[key]) {
distinct.push(key);
unique[key] = true;
}
});
return distinct;
}
// usage
uniqueBy(array, function(x){return x.age;}); // outputs [17, 35]
I've started sticking Underscore in all new projects by default just so I never have to think about these little data-munging problems.
var array = [{"name":"Joe", "age":17}, {"name":"Bob", "age":17}, {"name":"Carl", "age": 35}];
console.log(_.chain(array).map(function(item) { return item.age }).uniq().value());
Produces [17, 35].
Here's another way to solve this:
var result = {};
for(var i in array) {
result[array[i].age] = null;
}
result = Object.keys(result);
or
result = Object.values(result);
I have no idea how fast this solution is compared to the others, but I like the cleaner look. ;-)
EDIT: Okay, the above seems to be the slowest solution of all here.
I've created a performance test case here: http://jsperf.com/distinct-values-from-array
Instead of testing for the ages (Integers), I chose to compare the names (Strings).
Method 1 (TS's solution) is very fast. Interestingly enough, Method 7 outperforms all other solutions, here I just got rid of .indexOf()
and used a "manual" implementation of it, avoiding looped function calling:
var result = [];
loop1: for (var i = 0; i < array.length; i++) {
var name = array[i].name;
for (var i2 = 0; i2 < result.length; i2++) {
if (result[i2] == name) {
continue loop1;
}
}
result.push(name);
}
The difference in performance using Safari & Firefox is amazing, and it seems like Chrome does the best job on optimization.
I'm not exactly sure why the above snippet is so fast compared to the others; maybe someone wiser than me has an answer. ;-)
Using lodash (note: _.pluck was removed in lodash 4; use _.map instead):
var array = [
{ "name": "Joe", "age": 17 },
{ "name": "Bob", "age": 17 },
{ "name": "Carl", "age": 35 }
];
_.chain(array).pluck('age').unique().value();
> [17, 35]
var array = [
    {"name":"Joe", "age":17},
    {"name":"Bob", "age":17},
    {"name":"Carl", "age": 35}
];
const ages = [...new Set(array.reduce((a, c) => [...a, c.age], []))];
console.log(ages);
const array = [
    { "name": "Joe", "age": 17 },
    { "name": "Bob", "age": 17 },
    { "name": "Carl", "age": 35 }
];
const key = 'age';
const arrayUniqueByKey = [...new Map(array.map(item => [item[key], item])).values()];
console.log(arrayUniqueByKey);
Simple distinct filter using a Map:
let array = [
    {"name":"Joe", "age":17},
    {"name":"Bob", "age":17},
    {"name":"Carl", "age": 35}
];
let data = new Map();
for (let obj of array) {
    data.set(obj.age, obj);
}
let out = [...data.values()];
console.log(out);
Using Lodash
var array = [
{ "name": "Joe", "age": 17 },
{ "name": "Bob", "age": 17 },
{ "name": "Carl", "age": 35 }
];
_.chain(array).map('age').unique().value();
Returns [17,35]
function get_unique_values_from_array_object(array,property){
var unique = {};
var distinct = [];
for( var i in array ){
if( typeof(unique[array[i][property]]) == "undefined"){
distinct.push(array[i]);
}
unique[array[i][property]] = 0;
}
return distinct;
}
With underscore.js: _.uniq(_.pluck(array, "age"))
Here's a versatile solution that uses reduce, allows for mapping, and maintains insertion order.
items: An array
mapper: A unary function that maps the item to the criterion, or omitted to use the item itself.
function distinct(items, mapper) {
if (!mapper) mapper = (item)=>item;
return items.map(mapper).reduce((acc, item) => {
if (acc.indexOf(item) === -1) acc.push(item);
return acc;
}, []);
}
Usage
const distinctLastNames = distinct(items, (item)=>item.lastName);
const distinctItems = distinct(items);
You can add this to your Array prototype and leave out the items parameter if that's your style...
const distinctLastNames = items.distinct( (item)=>item.lastName) ) ;
const distinctItems = items.distinct() ;
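A sketch of what that prototype attachment could look like (extending Array.prototype is generally discouraged in shared code; the method name distinct here just matches the usage above):

```javascript
// Attach distinct() to Array.prototype, non-enumerable so it does not
// show up in for...in loops.
Object.defineProperty(Array.prototype, "distinct", {
  value: function (mapper) {
    const fn = mapper || ((item) => item);
    return this.map(fn).reduce((acc, item) => {
      if (acc.indexOf(item) === -1) acc.push(item);
      return acc;
    }, []);
  }
});

const items = [{ lastName: "Smith" }, { lastName: "Jones" }, { lastName: "Smith" }];
console.log(items.distinct((item) => item.lastName)); // ["Smith", "Jones"]
console.log([1, 1, 2].distinct()); // [1, 2]
```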
You can also use a Set instead of an Array to speed up the matching (converting back to an array at the end, so callers still get an array):
function distinct(items, mapper) {
    if (!mapper) mapper = (item)=>item;
    const seen = items.map(mapper).reduce((acc, item) => {
        acc.add(item); // Set.add ignores duplicates
        return acc;
    }, new Set());
    return [...seen];
}
If you want to have an unique list of objects returned back. here is another alternative:
const unique = (arr, encoder=JSON.stringify, decoder=JSON.parse) =>
[...new Set(arr.map(item => encoder(item)))].map(item => decoder(item));
Which will turn this:
unique([{"name": "john"}, {"name": "sarah"}, {"name": "john"}])
into
[{"name": "john"}, {"name": "sarah"}]
The trick here is that we are first encoding the items into strings using JSON.stringify
, and then we are converting that to a Set (which makes the list of strings unique) and then we are converting it back to the original objects using JSON.parse
.
var array = [
    {"name":"Joe", "age":17},
    {"name":"Bob", "age":17},
    {"name":"Carl", "age": 35}
];
console.log(Object.keys(array.reduce((r, {age}) => (r[age] = '', r), {})));
Output:
Array ["17", "35"]
Just found this and I thought it's useful
_.map(_.indexBy(records, '_id'), function(obj){return obj})
Again using underscore, so if you have an object like this
var records = [{_id:1,name:'one'}, {_id:2,name:'two'}, {_id:1,name:'one'}]
it will give you the unique objects only.
What happens here is that indexBy
returns a map like this
{ 1:{_id:1,name:'one'}, 2:{_id:2,name:'two'} }
and just because it's a map, all keys are unique.
Then I'm just mapping this list back to array.
In case you need only the distinct values
_.map(_.indexBy(records, '_id'), function(obj,key){return key})
Keep in mind that the key
is returned as a string so, if you need integers instead, you should do
_.map(_.indexBy(records, '_id'), function(obj,key){return parseInt(key)})
I think you are looking for the groupBy function (using Lodash):
_personsList = [{"name":"Joe", "age":17},
{"name":"Bob", "age":17},
{"name":"Carl", "age": 35}];
_uniqAgeList = _.groupBy(_personsList,"age");
_uniqAges = Object.keys(_uniqAgeList);
produces result:
17,35
jsFiddle demo:http://jsfiddle.net/4J2SX/201/
[...new Set([
{ "name": "Joe", "age": 17 },
{ "name": "Bob", "age": 17 },
{ "name": "Carl", "age": 35 }
].map(({ age }) => age))]
Primitive Types
var unique = [...new Set(array.map(item => item.primitiveAttribute))];
For complex types e.g. Objects
var unique = [...new DeepSet(array.map(item => item.Object))];
export class DeepSet extends Set {
add (o: any) {
for (let i of this)
if (this.deepCompare(o, i))
return this;
super.add.call(this, o);
return this;
};
private deepCompare(o: any, i: any) {
return JSON.stringify(o) === JSON.stringify(i)
}
}