{ "name": "reducers", "id": "reducers", "version": "3.0.0-alpha", "description": "Library for higher-order manipulation of collections", "keywords": [ "reducers", "reducible", "reduce", "data", "sequence", "stream", "collection", "transformation", "composable", "async", "signal", "manipulation" ], "author": { "name": "Irakli Gozalishvili", "email": "rfobic@gmail.com", "url": "http://jeditoolkit.com" }, "homepage": "https://github.com/Gozala/reducers", "repository": { "type": "git", "url": "https://github.com/Gozala/reducers.git", "web": "https://github.com/Gozala/reducers" }, "bugs": { "url": "http://github.com/Gozala/reducers/issues/" }, "dependencies": { "reducible": "~1.x.0" }, "devDependencies": { "test": "~0.5.2", "repl-utils": "~2.0.1", "benchmark": ">=0.3.0", "phantomify": "~0.x.0" }, "scripts": { "test": "npm run test-node && npm run test-browser", "test-browser": "node ./node_modules/phantomify/bin/cmd.js ./test/index.js", "test-node": "node ./test/index.js", "repl": "node node_modules/repl-utils", "benchmark": "node test/benchmark.js" }, "licenses": [ { "type": "MIT", "url": "https://github.com/Gozala/reducers/License.md" } ], "readme": "# reducers\n\n[![Build Status](https://secure.travis-ci.org/Gozala/reducers.png)](http://travis-ci.org/Gozala/reducers)\n\nLibrary for higher-order manipulation of collections, based upon [reduce][].\n\n## Rationale\n\nMost functional languages (including beloved JS) typically come with some\ncollection transformation functions like [filter][] and [map][] that take a\nlogical collections and return transformed version of it. Unfortunately they\ntend to [complect][], by implying mechanism, order, laziness and\nrepresentation. 
This library is an attempt to provide a simple solution for\nsome of these hard problems by decomplecting and building upon a simple premise -\nthe minimum definition of a collection is something that is reducible.\n\nMore specifically, the library defines a super-generalized and minimal abstraction for\ncollections - a collection is some set of things that, when given a function to\napply to its contents, can do so and give you the result, i.e. a collection is\n(at minimum) **reducible**. In other words, you can call `reduce` on it.\n\nSuch a minimal abstraction for collections is more powerful than it may seem at\nfirst!\n\n## Basics\n\nDemonstrating the features of this library requires some basic understanding of\nthe abstraction above. So let's take a more practical look at the idea. Let's\nsay we have a `reduce` function with a *(very familiar)* API:\n\n```js\nreduce(source, f, initial) // => accumulated result\n```\n\nIt takes a reducible `source`, a reducing function `f` and an `initial` value to\naccumulate reductions upon. In return it outputs an accumulated result.\nReducing functions performing accumulation have the following shape:\n\n```js\nf(result, value) // => new result\n```\n\nA reducing function is simply a binary function, akin to the one you might pass\nto reduce. 
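For instance, over a plain array such a `reduce` behaves just like `Array.prototype.reduce` with an explicit initial value. Here is a minimal array-only sketch, for illustration only - the library's own `reduce` is polymorphic over source types:

```js
// Minimal array-only sketch of the `reduce` shape described above.
// Illustration only - the library's `reduce` dispatches on the source type.
function reduce(source, f, initial) {
  var result = initial
  for (var i = 0; i < source.length; i++) {
    result = f(result, source[i])
  }
  return result
}

var sum = reduce([1, 2, 3], function(result, value) {
  return result + value
}, 0)
// sum => 6
```
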
While the two arguments might be treated symmetrically by the\nfunction, there is an implied semantic that distinguishes the arguments:\nthe first argument is a `result` or accumulator that is being built up by the\nreduction, while the second is some new input `value` from the source being\nreduced.\n\n## Transformations\n\nAll of the collection operations can be expressed in terms of transformations.\nBy definition, all transformations produce **reducible** collections\nthat can be reduced via the `reduce` function defined above:\n\n```js\nmap(source, JSON.parse) // => reducible collection\nfilter(numbers, isEven) // => reducible collection\n```\n\nIn order to explain transformations we'll need a primitive API for producing\n**reducible** collections. Let's define one in the form of a `reducible` function\nthat takes an `accumulator` function and returns something that can be reduced\nvia the `reduce` function:\n\n\n```js\nreducible(accumulator) // => reducible\n```\n\nThe `accumulator` argument it takes is a function with the following shape:\n\n```js\naccumulate(next, initial) // => accumulated result\n```\n\nWhen invoked, it performs reductions via the `next` reducing function, starting\nfrom the `initial` result.\n\n\nNow consider the following implementation of the `map` & `filter` transformation\nfunctions, taking their arguments in the same order as the examples above:\n\n```js\nfunction map(source, f) {\n return reducible(function accumulator(next, initial) {\n return reduce(source, function reducer(result, input) {\n return next(result, f(input))\n }, initial)\n })\n}\n\nfunction filter(source, predicate) {\n return reducible(function accumulator(next, initial) {\n return reduce(source, function reducer(result, input) {\n return predicate(input) ? 
next(result, input) : result\n }, initial)\n })\n}\n```\n\nThere are a few things to note here:\n\n - The type of the source is irrelevant as long as it is reducible and therefore\n can be reduced via the `reduce` function.\n - Transformations do not traverse collections; instead they compose results\n that can be reduced by a receiver of the result later.\n - Transformations do not imply the timing at which `reducer` is invoked with\n each `input` of the `source`; therefore the `source` can be asynchronous.\n - Filtering can *skip* inputs by simply returning the incoming result.\n\n\n## Features\n\n### Laziness\n\nThe library consists of transformation functions which, as seen above, when called\ndo nothing except create a recipe for a new collection, a recipe that\nis itself reducible. No work is done on the contained elements yet and no\nconcrete collection is produced. All transformations defer the actual work\nto the point where the result of the transformation pipeline is reduced.\n\nThe beautiful thing is that this mechanism also works for all the other traditional\ntransformations: `take`, `drop`, `merge`, etc. Note that `filter` is\n(potentially) contractive, and `flatten` is (potentially) expansive per step -\nthe mechanism is general and not limited to 1:1 transformations.\n\n### Uniformity\n\nTransformation functions are absolutely agnostic of the actual type of the\n`source`, as they just describe transformations and leave it up to the `source`\nto do a reduction when the result is consumed.\n\nThe library takes advantage of this feature and goes even a step further by\ntreating every possible value as a reducible collection. Non-collection values\nlike numbers, booleans, objects etc. are treated as collections of a single item,\nthe item being the value itself. 
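A sketch of these conventions on plain values (illustrative only; the actual library implements them via polymorphic dispatch rather than explicit type checks):

```js
// Illustrative sketch of the uniformity rules described above.
// The real library uses polymorphic dispatch, not explicit type checks.
function reduceValue(value, f, initial) {
  if (value === null || value === undefined)
    return initial                      // empty collection: nothing to reduce
  if (Array.isArray(value))
    return value.reduce(f, initial)     // ordinary collection of many items
  return f(initial, value)              // any other value: a single-item collection
}

var append = function(result, value) { return result.concat([value]) }
reduceValue(42, append, [])        // => [42]
reduceValue(null, append, [])      // => []
reduceValue([1, 2], append, [])    // => [1, 2]
```
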
Also `null` and `undefined` are considered\nempty collections.\n\nThis means that the library can be used on any data type and, more importantly,\ntransformations between different data types compose naturally, which is\ngreat - it lets you define logic in terms of abstractions instead of specific\ntypes.\n\n### Composability\n\nAll transformations are fully composable; in fact, transformation\npipelines produce compositions equivalent to function compositions created by\n[compose][]. The type-agnostic nature of the transformation functions also enables\ncompositions between different types of data.\n\n### Performance\n\nSince transformations don't do the work, but merely create a recipe, there is\nno per-step allocation overhead, so it's faster. Also note that transformations\nare composed by currying transformation functions and all the actual work happens\nin a pipeline at the end when the result is consumed, which means that no\nintermediate collections are produced, unlike the case with arrays etc.\n\nThink [monad][] & [category theory][] if you fancy that.\n\nIt can even [outperform arrays][benchmarks] when used wisely, although that's not\nthe point & arrays are not the primary use case.\n\n### Asynchronicity\n\nAs already pointed out, transformation functions do not imply any timing\nof individual value delivery, which means they can be used on asynchronous\ndata structures like [node streams][stream-reduce] or [FRP][] events & signals.\n\nThis feature is extremely powerful as it allows structuring complex asynchronous\nprograms in simple intuitive code without [callback hell][] and manual error\npropagation. _See [lstree][] for examples_.\n\nEven better, the exact same code can be used with both synchronous and\nasynchronous data structures. 
For example, the exact same code in [fs-reduce][]\ncan be forced to do blocking IO via the `options.sync` option.\n\n### Extensibility\n\nSince transformations are `source` type agnostic, the library is highly extensible. In\nfact, the implementation is based on the polymorphic [method][] dispatch library and\nenables one to add support for new data types without any changes to this\nlibrary or the data types / classes themselves. This feature is used by the\n[stream-reduce][] library to add support for node streams. There are more\nexamples of this feature in [callback-reduce][], [dom-reduce][],\n[http-reduce][]...\n\nVery likely all the data types like `signal` provided by this library will be moved\nout into their own libraries too.\n\n### Automatic disposal\n\nReducible data structures feature automatic cleanup of resources at the end of\nconsumption. For example, [dom-reduce][] and [fs-reduce][] use this feature to\nremove event listeners / close file descriptors once input is consumed and to\nfree you from cleanup concerns. This means you spend more time on\nactual problems and less on plumbing.\n\n### Infinity\n\nInfinite data structures can be trivially represented via reducibles since\nnothing implies an end. In fact, [dom-reduce][] uses this feature to represent\nuser events in the form of reducibles that can well be infinite.\n\nThat being said, reducibles are not the best abstraction for some types of\ninfinite data structures, especially ones that are better polled instead.\n\n## F.A.Q.\n\n\n##### 1. Q: Can this handle \"back pressure\" ? 
\n \n**A:** Short answer is **Yes**.\n\nSee [IO Coordination] for more detailed answer\n\n\n\n\n## Install\n\n npm install reducers\n\n## Prior art\n\n- [Clojure reducers][]\n- [Haskell Enumerator/Iteratee][]\n\n[Clojure reducers]:http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html\n[Haskell Enumerator/Iteratee]:http://www.haskell.org/haskellwiki/Enumerator_and_iteratee\n\n[reduce]:http://en.wikipedia.org/wiki/Reduce_%28higher-order_function%29\n[map reduce]:http://en.wikipedia.org/wiki/MapReduce\n[map]:https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Array/map\n[filter]:https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Array/filter\n[Uniformity]:http://en.wikipedia.org/wiki/Uniformity_%28complexity%29#Uniformity\n[complect]:http://www.infoq.com/presentations/Simple-Made-Easy\n[compose]:http://underscorejs.org/#compose\n[monad]:http://en.wikipedia.org/wiki/Monad_%28category_theory%29\n[Category theory]:http://en.wikipedia.org/wiki/Category_theory]\n[benchmarks]:http://jsperf.com/reducibles/4\n[stream-reduce]:https://github.com/Gozala/stream-reduce\n[FRP]:http://en.wikipedia.org/wiki/Functional_reactive_programming\n[method]:https://github.com/Gozala/method\n[callback-reduce]:https://github.com/Gozala/callback-reduce\n[dom-reduce]:https://github.com/Gozala/dom-reduce\n[http-reduce]:https://github.com/Gozala/http-reduce\n[callback hell]:http://callbackhell.com/\n[fs-reduce]:https://github.com/Gozala/fs-reduce\n[lstree]:https://github.com/Gozala/callback-reduce\n\n[IO Coordination]:https://github.com/Gozala/reducers/wiki/IO-Coordination\n", "readmeFilename": "Readme.md", "_id": "reducers@3.0.0-alpha", "_from": "reducers" }