lrucache's People

Contributors

chowells79, k-bx, mstksg, pheaver

lrucache's Issues

Change `Maybe Integer` size restriction type to `Int`

Your LRU cache cannot store more than (maxBound :: Int) - 1 elements, as the corresponding memory requirements would exceed the range of memory your CPU can address. Hence, you can use a size restriction of maxBound :: Int to simulate an unbounded (i.e., memory-bounded) LRU cache. However, I can see only a few use cases for a memory-bounded LRU cache, as I would normally use an LRU cache precisely to avoid using all memory.

Therefore, I suggest changing the API to always require a size restriction of type Int and documenting that maxBound :: Int can be used when a purely memory-bounded instance of the data structure is desired.
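A minimal sketch of the difference (the Int-based newLRU below is hypothetical; only the Maybe Integer call matches the current API):

import Data.Cache.LRU (LRU, newLRU)

-- Current API: the bound is a Maybe Integer, with Nothing meaning "unbounded".
bounded :: LRU Int String
bounded = newLRU (Just 1000)

-- Proposed Int-based API (hypothetical): an effectively unbounded cache
-- would instead be requested explicitly.
--   bounded'   = newLRU 1000
--   unbounded' = newLRU (maxBound :: Int)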

`insertInforming` is not working correctly

Running the following in ghci:

import Data.Cache.LRU
let l0 = newLRU (Just 2) :: LRU Int Int
let (l1, d1) = insertInforming 1 1 l0
let (l2, d2) = insertInforming 2 2 l1
let (l3, d3) = insertInforming 3 3 l2
print [(l1, d1), (l2, d2), (l3, d3)]

Results in:

[(fromList [(1,1)],Nothing),(fromList [(2,2),(1,1)],Nothing),(fromList [(3,3),(2,2)],Just (3,3))]

d3 should be Just (1, 1) rather than Just (3, 3). It seems that insertInforming returns the newly inserted element as the dropped element when the cache is full, while it is supposed to return the element that was actually dropped. See https://github.com/chowells79/lrucache/blob/master/src/Data/Cache/LRU/Internal.hs#L146

When the cache is not full, it works correctly.
Additionally, the IO module does not expose an insertInforming function. I can create a PR for these changes if you are willing to merge it.
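
For reference, a minimal standalone check of the expected behaviour (a sketch, not part of the library's test suite):

import Data.Cache.LRU (LRU, newLRU, insertInforming)

main :: IO ()
main = do
    let l0       = newLRU (Just 2) :: LRU Int Int
        (l1, _)  = insertInforming 1 1 l0
        (l2, _)  = insertInforming 2 2 l1
        (_,  d3) = insertInforming 3 3 l2  -- cache is full; key 1 is least recently used
    print (d3 == Just (1, 1))              -- expected to print True once fixed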

Make API fit canonical use in `Control.Monad.State`

Hi chowells79,

using an LRU cache is an inherently stateful operation. In user code, a State monad is often used to provide the necessary ordering of operations. The canonical way to lift your operations into the State monad is the state function provided by Control.Monad.State (http://hackage.haskell.org/packages/archive/transformers/latest/doc/html/Control-Monad-Trans-State-Lazy.html#v:state). This requires the returned tuple to have the form (result, new_state). Hence, I suggest changing your functions to return their results in this form.
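
For illustration, a sketch of lifting lookup with the current tuple order versus the suggested one (lookupS is just a hypothetical helper name):

import Control.Monad.State (State, state)
import qualified Data.Cache.LRU as LRU

-- Current order (new_cache, result): the tuple has to be swapped by hand.
lookupS :: Ord k => k -> State (LRU.LRU k v) (Maybe v)
lookupS k = state $ \cache ->
    let (cache', result) = LRU.lookup k cache
    in  (result, cache')

-- If LRU.lookup returned (result, new_cache) instead, this would reduce to:
--   lookupS k = state (LRU.lookup k)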

best regards,
Simon

undefined error when calling insert on GHC 7.6.1

I'm using:

ghc-7.6.1
array-0.4.0.1
deepseq-1.3.0.1
containers-0.5.0.0
lrucache-1.1.1.1

When I execute the file:

import qualified Data.Cache.LRU.IO as Lru

main :: IO ()
main = do
    lru <- Lru.newAtomicLRU (Just 10)

    _ <- Lru.lookup "foo" lru
    _ <- Lru.lookup "bar" lru

    Lru.insert "bar" "hi" lru
    Lru.insert "foo" "hi" lru

I get

Prelude.undefined

This is quite serious since it is affecting all hakyll users on GHC 7.6.1.

@A1kmm If you have some time, could you try to reproduce this? You also appear to be using lrucache on ghc-7.6.1 (#6).

Possible performance optimizations

Hi chowells79,

a friend just pointed me to your library, as he was using it as a blueprint for implementing an adaptive replacement cache. While investigating how your code works, I spotted a few possible performance optimizations for the pure version of the LRU cache:

  1. It seems that the Maybes for first and last in the datatype definition of LRU are used to handle the case of an empty cache. If you introduce a second constructor Empty, you could simplify your cache definition and gain performance due to fewer indirections (see the sketch after this list).
  2. You might want to consider using a circular doubly linked list. This way you would only need a pointer to first, and the Maybes in LinkedVal could also be dropped. The last element of the list is then the one whose next key equals the first key of the cache. Again, getting rid of Maybes means getting rid of indirections and thus realizing a performance gain.
  3. You might want to consider storing the size bound in a plain Int. For an unlimited size, you would store maxBound :: Int. The reason is that you will never be able to store more than maxBound :: Int elements in your LRU cache, as this would exceed the range of memory addressable by your CPU. Again this improves performance, as indirections and code complexity are reduced.
  4. You do not require the property of Maps that they keep their keys in an ordered sequence. You could therefore switch to using a persistent hash table (analogous to Johan Tibell's unordered-containers package). I do not see a way to implement it using the public API of unordered-containers, though: the problem is that the value you store in the underlying IntMap depends on the hash of the key. The performance gain would stem from two sources: first, fewer indirections, because the next and prev "pointers" can then be fully unpacked into the LinkedVal, and second, hashing plus IntMap operations are faster than applying the comparison operation on a key logarithmically many times. See Johan's post here: http://blog.johantibell.com/2011/11/slides-from-my-guest-lecture-at.html.
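
To make suggestions 1–3 concrete, here is a rough sketch with hypothetical type definitions (these are not the library's actual internals):

import Data.Map (Map)

-- Suggestion 2: in a circular list, prev and next always exist, so no Maybes.
data LinkedVal key val = LinkedVal
    { value :: val
    , prev  :: !key
    , next  :: !key
    }

-- Suggestions 1 and 3: a dedicated Empty constructor removes the Maybes around
-- first/last, and the bound becomes a plain Int (maxBound :: Int = "unbounded").
data LRU key val
    = Empty !Int                                  -- just the size bound
    | LRU { lruFirst :: !key                      -- most recently used key
          , lruLast  :: !key                      -- least recently used key
          , lruMax   :: !Int                      -- size bound
          , lruMap   :: !(Map key (LinkedVal key val))
          }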

Happy haskell hacking,
Simon
