
Comments (11)

orkolorko commented on May 29, 2024

I wrote some simple code calling the intrinsics and packaged it:

https://github.com/orkolorko/SetRoundingLLVM.jl

It is a workaround until setrounding is fixed in the main Julia codebase.

@dpsanders I checked, and directed rounding via the LLVM intrinsic also works on Windows and macOS (I added some simple tests verifying that the directed rounding works).

from intervallinearalgebra.jl.
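The core of the workaround is calling the `llvm.set.rounding` intrinsic from Julia. A minimal sketch of that idea via `Base.llvmcall` (not the actual SetRoundingLLVM.jl source, which may differ; the mode encoding follows LLVM's `FLT_ROUNDS` convention) might look like:

```julia
# Minimal sketch (not the actual SetRoundingLLVM.jl code) of calling the
# llvm.set.rounding intrinsic. Mode encoding per LLVM's FLT_ROUNDS
# convention: 0 = toward zero, 1 = nearest (ties to even),
# 2 = toward +Inf, 3 = toward -Inf.
function setrounding_raw(mode::Int32)
    Base.llvmcall(
        ("declare void @llvm.set.rounding(i32)",
         "call void @llvm.set.rounding(i32 %0)\nret void"),
        Cvoid, Tuple{Int32}, mode)
end
```

Note that Julia's optimizer still assumes round-to-nearest, so code executed between two such calls has to be protected from constant folding and reordering (e.g. by keeping the operands behind a `Ref` or a function barrier).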

orkolorko commented on May 29, 2024

Hi @lucaferranti, I was thinking of looking into this. Do you still think the best approach would be to define an IntervalMatrix type?


orkolorko commented on May 29, 2024

Moreover, there is another issue: in

JuliaLang/julia#27166

setrounding for Float64 was deprecated, which leaves us in uncharted waters...


dpsanders commented on May 29, 2024

https://github.com/matsueushi/RoundingEmulator.jl replaces setrounding for Float64. That's what we use in IntervalArithmetic.jl.

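For context, the emulation idea is roughly this: compute the round-to-nearest result, recover the exact rounding error with an error-free transformation (Knuth's TwoSum), and nudge the result by one ulp when the error has the wrong sign. A simplified, self-contained sketch for addition (RoundingEmulator.jl's real implementation also handles overflow and NaN edge cases):

```julia
# Simplified sketch of emulated directed rounding for addition; not the
# actual RoundingEmulator.jl code (which also handles overflow/NaN).
function twosum(a::Float64, b::Float64)
    s = a + b                     # round-to-nearest sum
    v = s - a
    e = (b - v) + (a - (s - v))   # exact rounding error of s (Knuth's TwoSum)
    return s, e
end

function add_up(a::Float64, b::Float64)
    s, e = twosum(a, b)
    return e > 0 ? nextfloat(s) : s   # s undershot the exact sum: bump up
end

function add_down(a::Float64, b::Float64)
    s, e = twosum(a, b)
    return e < 0 ? prevfloat(s) : s   # s overshot the exact sum: bump down
end
```

This is why the approach costs several floating-point operations per emulated one, which matters for the matrix-multiplication discussion below.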

orkolorko commented on May 29, 2024

@dpsanders thank you, I will look into it.

I am thinking of changing the rounding modes from C, using ccall and Glibc (thus restricting the package to Linux). The main issue is that the implementation of the matrix product in the library is fast because it relies on LAPACK with directed rounding.

I'm checking some other work of the Waseda group to see whether they have alternatives to this, but if we want to use the high-performance matrix multiplication, we need to change the rounding mode on the processor.

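The ccall route sketched above would go through the C library's fesetround. A hypothetical version (the constants are the x86 glibc values from <fenv.h>, which is exactly why this restricts the package to a specific platform):

```julia
# Hypothetical sketch: switch the FPU rounding mode through libc's
# fesetround. The constants are the x86 glibc values from <fenv.h>;
# other platforms use different values.
const FE_TONEAREST  = Cint(0x0000)
const FE_DOWNWARD   = Cint(0x0400)
const FE_UPWARD     = Cint(0x0800)
const FE_TOWARDZERO = Cint(0x0c00)

# Returns 0 on success, like the underlying C function.
fesetround(mode::Cint) = ccall(:fesetround, Cint, (Cint,), mode)
```

Even with the hardware mode switched, the compiler must be kept from folding or reordering the guarded floating-point operations, which is the problem discussed in the next comments.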

dpsanders commented on May 29, 2024

Ah I see. Unfortunately changing the rounding mode is complicated due to the way that LLVM works, which is why setrounding was removed for Float64. (Basically it does a premature optimization that fails to take account of the rounding mode.) This may have been fixed since the last time I looked into it, though, since that was a couple of years ago. You should be able to find issues on the JuliaLang GitHub about this.

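The hazard can be illustrated with a tiny hypothetical example: an expression that LLVM folds at compile time bakes in round-to-nearest, regardless of the FPU state at run time.

```julia
# Illustration of the hazard (not actual Julia internals): LLVM may
# constant-fold floating-point arithmetic assuming round-to-nearest.
folded() = 1.0 + 2.0^-60   # likely folded to 1.0 during compilation

# Even if the hardware rounding mode were set to round upward before the
# call, a folded version still returns 1.0 rather than nextfloat(1.0).
```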

orkolorko commented on May 29, 2024

@dpsanders It seems like LLVM now has an intrinsic for setting rounding modes,
https://reviews.llvm.org/D74729
but it may be beyond my ability to implement code using it.
I will try to play around with it.

12:21 - It was not too difficult to call the intrinsic, so I have working code that changes the rounding modes; I'm writing some more tests and some macros.


lucaferranti commented on May 29, 2024

(sorry for the radio silence, I'll come back to this this evening, European time)


lucaferranti commented on May 29, 2024

Hi @orkolorko thanks for the interest in the package!

Do you still think the best approach would be to define an IntervalMatrix type?
I am pretty sure the current * is type piracy and should be changed. The options are:

  1. Define an IntervalMatrix type. This is also appealing because I think several linear algebra applications (multiplication, eigenvalues) would benefit from a midpoint-radius representation, and this would give us the freedom to represent interval matrices that way (although this is arguably a little unorthodox from a traditional interval arithmetic perspective). There is also IntervalMatrices.jl, which defines that, but I am not convinced about adding it as a dependency.
  2. Use a different operator for interval matrix multiplication.

As David pointed out above, setrounding(Float64) was removed from Julia base, and SetRounding.jl basically copy-pasted that part of the code into its own package for our use.

Rump's fast matrix multiplication relies on reducing interval matrix multiplication to ordinary floating-point matrix multiplications. Using RoundingEmulator.jl would effectively undo this.

I am no expert on LLVM, but I was also under the impression that in more recent versions changing the rounding mode safely(?) should be possible. If you managed to implement it, that is absolutely fantastic!!

Just a small note: if what you are doing actually works, I think it would broadly affect all packages in JuliaIntervals, not just IntervalLinearAlgebra. I am very interested in following its development. It would be particularly interesting to see how it compares to 1) the current use of SetRounding.jl and 2) the use of RoundingEmulator.jl. Those results would be very valuable, and your work could replace SetRounding.jl in IntervalArithmetic.jl if it works. I can help you draft some tests and benchmarks to check the LLVM approach. This maybe goes beyond the original purpose of this issue, though. What about opening an issue in SetRoundingLLVM to discuss how to benchmark and test it? :)

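To make the reduction concrete, here is a hedged sketch of the enclosure step behind Rump's approach: two ordinary (BLAS-backed) products computed under downward and upward rounding enclose the exact product entrywise. All names are illustrative, and `setround` stands in for whichever working rounding-mode switch is chosen (SetRounding.jl, the LLVM intrinsic, or a libc call):

```julia
# Sketch of enclosing a floating-point matrix product with two directed-
# rounding products, in the spirit of Rump's method. `setround` is a
# hypothetical stand-in for a working rounding-mode switch; with it in
# effect, the ordinary `*` yields entrywise lower/upper bounds.
function enclose_product(A::Matrix{Float64}, B::Matrix{Float64}, setround)
    setround(RoundDown)
    Cinf = A * B                        # entrywise lower bound
    setround(RoundUp)
    Csup = A * B                        # entrywise upper bound
    Cm = Cinf .+ (Csup .- Cinf) ./ 2    # midpoint (computed under RoundUp)
    Cr = Cm .- Cinf                     # radius covering the enclosure
    setround(RoundNearest)
    return Cm, Cr
end
```

The point of the thread is precisely that no reliable `setround` for Float64 currently exists in base Julia; with a no-op switch the two products coincide and the radius collapses to zero.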

orkolorko commented on May 29, 2024

Hi @lucaferranti, I invited you to SetRoundingLLVM and opened an issue there.
About Rump matrix multiplication:

  • I think midpoint-radius for matrices makes sense; moreover, I think faster complex linear algebra routines are possible if we work with complex balls (if I remember correctly, this is already in Rump's original paper)
  • I'm experimenting with implementing other algorithms from Ozaki, Ogita, Rump, Oishi, "Fast algorithms for floating-point interval matrix multiplication"; I think it may be worth having them in the package, even if not used as the default algorithm

I think converting to midpoint-radius is a really good idea; I can start working on it in a refactor branch if we agree on it (I need complex matrix multiplication anyway for some other work I'm doing...)


lucaferranti commented on May 29, 2024

I'm experimenting with implementing other algorithms from Ozaki, Ogita, Rump, Oishi, "Fast algorithms for floating-point interval matrix multiplication"; I think it may be worth having them in the package, even if not used as the default algorithm

I have to confess, when I read the paper last year I was not super convinced by the results, hence the algorithms didn't make it to my to-do list. Still, I agree it would be valuable to have them available, at least for reproducing the results and for benchmarking.

