jlchan / fluxdiffutils.jl

Utilities for performing flux differencing and computing Jacobian matrices for entropy stable summation-by-parts methods.

License: MIT License
Would like the matrices returned by the hadamard_* functions (in particular hadamard_jacobian) to be compatible with the vector of vectors, U, used to construct them.
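A minimal sketch of the mismatch, assuming the Jacobian is returned as one blocked sparse matrix with fields stacked; the variable names here are hypothetical, not the package's API:

```julia
using SparseArrays

# Hypothetical illustration: a 2-field blocked Jacobian cannot be applied
# directly to U = [u1, u2] (a vector of field vectors); the fields must first
# be concatenated, and the result re-split into fields afterwards.
n = 4
U = [rand(n), rand(n)]                 # vector of field vectors
J = sprand(2n, 2n, 0.3)                # blocked Jacobian, fields stacked

JU_flat = J * vcat(U...)               # apply to the stacked vector
JU = [JU_flat[1:n], JU_flat[n+1:2n]]   # re-split into fields
```

Returning the Jacobian in a form that accepts U directly would remove this vcat/re-split boilerplate on the caller's side.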
Should write directly to nzval in the Jacobian blocks instead of using A[i,j] indexing. The latter is slow; see the MWE:
using SparseArrays

function copy_sparse!(A, B)
    rows = rowvals(B)
    vals = nonzeros(B)
    for j = 1:size(B, 2)
        for row_id in nzrange(B, j)
            i = rows[row_id]
            # modify B_ij = f(B_ij, i, j) ...
            A[i, j] = vals[row_id]           # slow: searches A's sparsity structure
            # A.nzval[row_id] = vals[row_id] # faster
        end
    end
    return A
end
Running with N = 1000 matrices (A both dense and sparse) shows that sparse writes are much slower:

julia> @btime copy_sparse!($Asparse, $B);
  11.801 μs (0 allocations: 0 bytes)

julia> @btime copy_sparse!($Adense, $B);
  2.049 μs (0 allocations: 0 bytes)
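The commented-out line in the MWE suggests the fix; a sketch of that fast path, assuming A and B share the same sparsity pattern (the function name here is made up for illustration):

```julia
using SparseArrays

# Fast variant: write straight into the nonzero storage of A, skipping the
# structural search that A[i,j] indexing performs on every write.
# Valid only when A and B have identical sparsity patterns (same colptr/rowval).
function copy_sparse_fast!(A::SparseMatrixCSC, B::SparseMatrixCSC)
    vals  = nonzeros(B)
    Avals = nonzeros(A)
    for row_id in eachindex(vals)
        Avals[row_id] = vals[row_id]
    end
    return A
end
```

Because the patterns match, the nzval indices of A and B line up one-to-one, so each write is a plain array store.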
Fix so that users don't need to specify scale = ±1, just the matrix type (symmetric, skew-symmetric, or general).
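One possible shape for such an API (a sketch only; these type and function names are hypothetical, not the package's): encode the matrix type as a trait and derive the ±1 scaling from it by dispatch.

```julia
# Hypothetical trait sketch: the user declares the operator's symmetry type
# and the ±1 scaling is derived from it, never supplied by hand.
abstract type MatrixSymmetry end
struct Sym     <: MatrixSymmetry end   # A == A'
struct Skew    <: MatrixSymmetry end   # A == -A'
struct General <: MatrixSymmetry end

scale(::Sym)  =  1                     # symmetric: mirrored entry enters with +1
scale(::Skew) = -1                     # skew-symmetric: mirrored entry enters with -1

# general matrices get no mirrored contribution, so no scale applies
needs_mirror(::General)        = false
needs_mirror(::MatrixSymmetry) = true
```

Dispatching on the symmetry type also lets the half-traversal optimization (visiting only one triangle of the operator) be enabled automatically for symmetric and skew-symmetric inputs.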
Line 75 of hadamard_jacobian.jl causes the function hadamard_jacobian(A_list, product_type...) to be type unstable. The instability can be recreated with the following snippet of code:
using SparseArrays, StaticArrays

function makeBlockJ(U)
    Nfields = length(U)
    n = length(U[1])
    Z = spzeros(n, n)
    # Nfields is a runtime value used as a static size parameter -> unstable
    J = SMatrix{Nfields, Nfields}([Z for i in 1:Nfields, j in 1:Nfields])
    return J
end

@code_warntype makeBlockJ(U)
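The instability arises because Nfields = length(U) is a runtime value used as an SMatrix size parameter. A sketch of the usual function-barrier fix, passing the field count through Val so it is known at compile time (the two-argument signature here is an assumption, not the package's current API):

```julia
using SparseArrays, StaticArrays

# Sketch of a fix: lift Nfields into the type domain with Val, so the
# SMatrix size parameters are compile-time constants inside the function.
function makeBlockJ(U, ::Val{Nfields}) where {Nfields}
    n = length(U[1])
    return SMatrix{Nfields, Nfields}([spzeros(n, n) for i in 1:Nfields, j in 1:Nfields])
end

U = [rand(3), rand(3)]
J = makeBlockJ(U, Val(2))   # size parameters known statically in the body
```

The call site that constructs Val(length(U)) is still dynamic, but it acts as a function barrier: everything inside makeBlockJ infers concretely.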
This issue is used to trigger TagBot; feel free to unsubscribe. If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details. If you'd like for me to do this for you, comment TagBot fix on this issue. I'll open a PR within a few hours; please be patient!
Passing sparse matrices to hadamard_sum_ATr! seems slower than passing dense matrices.
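A plausible cause, illustrated with a generic hadamard-sum-style kernel (a sketch, not the package's actual implementation): with a dense ATr the inner loop is a contiguous column sweep, while each random access ATr[j, i] into a sparse operator pays for a structural search unless the nonzeros are traversed directly.

```julia
using SparseArrays, LinearAlgebra

# Generic sketch of a hadamard-sum-style kernel:
#   rhs_i = sum_j ATr[j, i] * f(u_i, u_j)
# Dense ATr: contiguous reads down each column. Sparse ATr: every
# ATr[j, i] lookup searches the column's row indices.
function hadamard_sum_ATr(ATr, u, f)
    n = length(u)
    rhs = zeros(n)
    for i in 1:n, j in 1:n
        rhs[i] += ATr[j, i] * f(u[i], u[j])
    end
    return rhs
end

n = 50
A = sprand(n, n, 0.2)
u = rand(n)
favg = (x, y) -> 0.5 * (x + y)   # a stand-in two-point flux

rhs_sparse = hadamard_sum_ATr(transpose(A), u, favg)
rhs_dense  = hadamard_sum_ATr(Matrix(transpose(A)), u, favg)
```

Both paths produce the same result; only the access pattern differs, which is why a sparse-aware loop over nzrange (or simply densifying small operators) can win.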
Thinking of changing the name to FluxDiffUtils.jl, since the new hadamard_sum for dense operators is general purpose.

Matrices are no longer sparse after the broadcasted call to transpose in hadamard_sum and hadamard_sum!(rhs, A_list,...).
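A small sketch of how sparsity can be kept when materializing the transpose: copy on a lazy transpose of a SparseMatrixCSC has a specialized method that returns a sparse matrix, while converting through Matrix densifies.

```julia
using SparseArrays

A = sprand(10, 10, 0.2)

ATr_sparse = copy(transpose(A))    # specialized copy: stays a SparseMatrixCSC
ATr_dense  = Matrix(transpose(A))  # densifies the operator
```

Materializing ATr once with copy(transpose(A)) before the hot loop avoids both the densification and the per-access cost of the lazy wrapper.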