I am attempting to use the Google Places API in order to get the place name of a location I am in.
The returned data structure has the following types:
descriptor: 'street_number' | 'neighborhood' | 'postal_code' | 'route' | 'locality' | 'postal_town' | 'administrative_area_level_2' | 'administrative_area_level_1' | 'country'

places: [
  {
    address_components: [{
      long_name: string,
      short_name: string,
      types: [descriptor, descriptor?]
    }],
    ...other fields, not relevant here
  }
]
There is no guarantee how many address components any given place will have, or that it will have any at all; nor is there any guarantee which types will or will not be represented.
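For concreteness, a `places` value of this shape might look like the following (hypothetical sample data, not real API output):

```javascript
// Hypothetical sample data matching the shape above (not real API output).
const places = [
  {
    address_components: [
      { long_name: 'United Kingdom', short_name: 'GB', types: ['country'] },
      { long_name: 'Camden Town', short_name: 'Camden Town', types: ['neighborhood', 'political'] }
    ]
  },
  // A place whose address_components list is empty is perfectly legal.
  { address_components: [] }
]
```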
I would like to write code that returns the long_name of the first address_component whose first type (i.e. R.view(R.lensPath(['types', 0]))) is 'neighborhood' if one exists; otherwise 'locality', then 'postal_town', 'administrative_area_level_2', 'administrative_area_level_1', and finally 'country'.
So I start with R.pluck('address_components', places). I could then reduce each list down to an object, inserting the first occurrence of each of the keys I am interested in, and finally look up a value. Something like:
const interestingTypes = ['neighborhood', 'locality', 'postal_town', 'administrative_area_level_2', 'administrative_area_level_1', 'country']
const res = R.mergeAll(
  R.pluck('address_components', places).map((addressComponentList) =>
    addressComponentList.reduce((memo, addressComponent) => {
      if (interestingTypes.indexOf(addressComponent.types[0]) !== -1) {
        if (!memo[addressComponent.types[0]]) {
          memo[addressComponent.types[0]] = addressComponent.long_name
        }
      }
      return memo
    }, {})))

res[R.find((type) => Object.keys(res).indexOf(type) !== -1, interestingTypes)]
While it is certainly true that this can be made marginally more idiomatic by replacing the native .reduce and .map with R.reduce/R.map, that does not really address the fundamental problems:
1) This will iterate through every single member of the list even after finding the result.
2) The resulting structure still needs to be iterated (with the find, for instance) to actually find the tightest bound.
What would a pure functional, preferably lazy implementation of this look like? What features of Ramda could come in handy? Could I use lenses for this in some way? Function composition? Something else?
And is it ok to mix and match native map/reduce with ramda? Surely native calls are better than library invocations whenever possible?
One approach would be to create a lazy version of R.reduceRight. This could then be used to build a function that finds the minimum element of a (non-empty) list by some measure, given a known lower bound on that measure:
If the lower bound is ever encountered, the recursion stops and returns that result immediately.
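The answer's original code blocks are not shown here, so the following is only a sketch of the idea (the names lazyReduceRight and the exact signatures are assumptions; boundMinBy is the name used below). It is written in plain JavaScript so it runs without Ramda installed:

```javascript
// Sketch: a fold from the right whose accumulator is passed as a thunk,
// so the reducing function can return a result without ever forcing the
// rest of the fold (i.e. it is lazy in the tail).
const lazyReduceRight = (fn, acc, list) =>
  list.length === 0
    ? acc
    : fn(list[0], () => lazyReduceRight(fn, acc, list.slice(1)))

// Minimum element of a list by measure `f`, short-circuiting as soon as
// an element hits the known lower bound `bound`. Returns undefined for
// an empty list.
const boundMinBy = (bound, f, list) =>
  lazyReduceRight((x, restThunk) => {
    if (f(x) === bound) return x          // lower bound hit: never force the rest
    const best = restThunk()
    return best === undefined || f(x) <= f(best) ? x : best
  }, undefined, list)
```

For example, boundMinBy(0, x => x, [3, 0, 2]) returns 0 without ever examining the 2, because the fold of the tail is never forced once the bound is found.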
With boundMinBy available, we can create a lookup table mapping address types to sort order values, along with a function that produces the sort order value for a given address component:
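A sketch of such a table and accessor (the names typeOrder and sortValue are mine, not necessarily the answer's):

```javascript
// Hypothetical lookup table: map each interesting address type to a
// priority, lower meaning tighter (more specific).
const typeOrder = {
  neighborhood: 0,
  locality: 1,
  postal_town: 2,
  administrative_area_level_2: 3,
  administrative_area_level_1: 4,
  country: 5
}

// Sort order value for one address component; components whose first
// type is missing or uninteresting map to Infinity so they never win.
const sortValue = (component) => {
  const t = (component.types || [])[0]
  return t in typeOrder ? typeOrder[t] : Infinity
}
```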
And then we can compose it all together with a pipeline such as:
R.chain is used above to concatenate the address components of all places, which also drops any places whose address_components list is empty. I have included an example in the snippet below if you want to test it out with some data.
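Putting the pieces together, a self-contained sketch of the whole pipeline might look like this. It uses native flatMap where the answer uses R.chain, so it runs stand-alone; the helper names and the top-level function name tightestName are assumptions, not the answer's exact code:

```javascript
// Lazy fold from the right: the tail of the fold is only forced on demand.
const lazyReduceRight = (fn, acc, list) =>
  list.length === 0
    ? acc
    : fn(list[0], () => lazyReduceRight(fn, acc, list.slice(1)))

// Minimum by `f` with early exit when the known lower bound is reached.
const boundMinBy = (bound, f, list) =>
  lazyReduceRight((x, restThunk) => {
    if (f(x) === bound) return x
    const best = restThunk()
    return best === undefined || f(x) <= f(best) ? x : best
  }, undefined, list)

// Address types ordered from tightest to loosest.
const typeOrder = {
  neighborhood: 0, locality: 1, postal_town: 2,
  administrative_area_level_2: 3, administrative_area_level_1: 4, country: 5
}

const sortValue = (c) => {
  const t = (c.types || [])[0]
  return t in typeOrder ? typeOrder[t] : Infinity
}

// places -> long_name of the tightest interesting component, or undefined.
const tightestName = (places) => {
  const components = places
    .flatMap((p) => p.address_components || [])  // R.chain in the Ramda version
    .filter((c) => sortValue(c) !== Infinity)
  const best = boundMinBy(0, sortValue, components)
  return best && best.long_name
}
```

Because boundMinBy short-circuits at 0, the scan stops at the first 'neighborhood' component it encounters; otherwise it falls through to the best remaining match.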