
Comments (4)

fjetter commented on May 28, 2024

I strongly recommend not passing the computed array darr.compute() to pairwise_distance. It should just be darr. Otherwise this will materialize the entire array into memory and you will have to send all of it over the network.

I suspect that the bug you are running into is actually already fixed in dask/distributed#8507 but you wouldn't have a good time submitting 12TB from your client to the scheduler.
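
In code, the two call patterns under discussion look roughly like this (a minimal sketch with a small stand-in array in place of the 12 TB one):

import dask.array as da
from dask_ml.metrics import pairwise_distances

# Small stand-in for the large embedding array from the original report.
darr = da.random.random((1000, 64), chunks=(100, 64))

# Discouraged: darr.compute() materializes the whole array on the client and
# ships it along with the task graph.
distances = pairwise_distances(darr, darr.compute(), metric="cosine")

# Suggested instead: keep the second argument lazy too. (As the next comment
# points out, dask-ml's pairwise_distances rejects a dask Y, so this exact
# call raises a TypeError as the code currently stands.)
distances = pairwise_distances(darr, darr, metric="cosine")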


lsc64 commented on May 28, 2024

> I strongly recommend not passing the computed array darr.compute() to pairwise_distance. It should just be darr. Otherwise this will materialize the entire array into memory and you will have to send all of it over the network.
>
> I suspect that the bug you are running into is actually already fixed in dask/distributed#8507 but you wouldn't have a good time submitting 12TB from your client to the scheduler.

Unfortunately that function requires a dask.array and a numpy.array; otherwise it would of course be nicer not to do that.
https://github.com/dask/dask-ml/blob/b95ba909c6dcd37c566f5193ba0b918396edaaee/dask_ml/metrics/pairwise.py#L58

if isinstance(Y, da.Array):
    raise TypeError("`Y` must be a numpy array")

If I batch the materialized array into slices of 100k rows (which reduces the graph size), it works, so you're probably right!

import dask.array as da
from dask_ml.metrics import pairwise_distances
from tqdm import tqdm

# darr is the large dask array from the original report.
hists = []
batch_size = 100000
n_batches = -(-darr.shape[0] // batch_size)  # ceil division so the last partial batch is included
for batch in tqdm(range(n_batches)):
    # Materialize only the current 100k-row slice; the full darr stays lazy.
    distances = pairwise_distances(
        darr,
        darr[
            batch * batch_size : min((batch + 1) * batch_size, darr.shape[0])
        ].compute(),
        metric="cosine",
    )
    hist, bins = da.histogram(distances, bins=100, range=[0, 2])
    hists.append(hist)
da.compute(hists)  # works, but still computes everything at once
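
Since the final da.compute still evaluates all histograms at once, one possible variation is to materialize the counts batch by batch; a sketch reusing darr, batch_size, n_batches and the imports from the snippet above:

import numpy as np

# Accumulate histogram counts one batch at a time so that only a single
# batch's graph is live per iteration.
total_counts = np.zeros(100, dtype=np.int64)
for batch in tqdm(range(n_batches)):
    y = darr[batch * batch_size : min((batch + 1) * batch_size, darr.shape[0])].compute()
    distances = pairwise_distances(darr, y, metric="cosine")
    hist, bins = da.histogram(distances, bins=100, range=[0, 2])
    total_counts = total_counts + hist.compute()  # materialize this batch's counts before moving on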

Do I have the patch if I install from source?


fjetter commented on May 28, 2024

> Unfortunately that function requires a dask.array and a numpy.array,

Sorry, I missed that. I haven't tried to understand your batching code well enough to say whether it is correct. If it is, maybe you want to contribute this to dask-ml, because a "proper" dask algorithm works similarly. I don't know enough about the pairwise_distance algorithm to tell.

However, what I can tell you is that if you include a 12TB array in the map_blocks call of https://github.com/dask/dask-ml/blob/b95ba909c6dcd37c566f5193ba0b918396edaaee/dask_ml/metrics/pairwise.py#L60-L67 this will replicate that array to the scheduler and all dask workers. I doubt this is what you want to do.

> Do I have the patch if I install from source?

I just checked and this was already released in 2024.2.1 (the version you are running on). By breaking up the array you are avoiding all sorts of problems so if this is possible, go for it.


lsc64 commented on May 28, 2024

> Sorry, I missed that. I haven't tried to understand your batching code well enough to say whether it is correct. If it is, maybe you want to contribute this to dask-ml, because a "proper" dask algorithm works similarly. I don't know enough about the pairwise_distance algorithm to tell.

No worries, I just like to leave code snippets in case anyone has the same issue, so they're not faced with the unhelpful "nvm, I solved it". I can open a PR at some point and discuss this over there.

> However, what I can tell you is that if you include a 12TB array in the map_blocks call of https://github.com/dask/dask-ml/blob/b95ba909c6dcd37c566f5193ba0b918396edaaee/dask_ml/metrics/pairwise.py#L60-L67 this will replicate that array to the scheduler and all dask workers. I doubt this is what you want to do.

For X.map_blocks(fn, Y), does the array Y get fully replicated, X, or both? But that array/those arrays are not materialized, right?
The docs give an example of exactly what I/the function want to achieve (just for huge arrays and lambda a, b: distance(a, b)):

import dask.array as da

d = da.arange(5, chunks=2)
e = da.arange(5, chunks=2)
f = da.map_blocks(lambda a, b: a + b**2, d, e)
f.compute()
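
To make the replication question concrete, an illustrative sketch of the behaviour being described: dask arrays with matching chunks are paired chunk by chunk, while a plain numpy argument is passed whole to every block, so it travels with the graph to the scheduler and to each worker; the dask array X itself stays chunked and is not materialized.

import numpy as np
import dask.array as da

d = da.arange(10, chunks=2)
e = da.arange(10, chunks=2)
y = np.arange(10)

# Both arguments are dask arrays with matching chunks: chunk i of d is paired
# with chunk i of e, so neither array is replicated as a whole.
chunkwise = da.map_blocks(lambda a, b: a + b**2, d, e)

# The second argument is an in-memory numpy array: it is passed verbatim to
# every block of d, so the full y is shipped with the graph to the scheduler
# and to each worker, while d itself stays chunked.
replicated = d.map_blocks(lambda a, b: a + b.sum(), y)

print(chunkwise.compute())
print(replicated.compute())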

> I just checked and this was already released in 2024.2.1 (the version you are running on). By breaking up the array you are avoiding all sorts of problems so if this is possible, go for it.

We need bigger graphs!! /s (but maybe actually)

