google / ffcc
Fast Fourier Color Constancy: an auto white balance solution with machine learning in Fourier space
Home Page: https://jonbarron.info/
License: Apache License 2.0
Thank you for providing the Gehler-Shi data (your repo may be the last public source of it!). The Gehler-Shi website no longer provides the zip downloads.
While reviewing the code in internal/TrainModel.m (lines 283–288), I found the original code below:
if params.DEEP.WHITEN_FEATURES
% Unwhiten the first layer according to the whitening transformation, so
% that it produces the correct output on the unwhitened feature vectors.
model.W{1} = model.W{1} * whitening_transformation.A;
model.b{1} = model.W{1} * whitening_transformation.b + model.b{1};
end
Since model.W{1} is overwritten before model.b{1} is updated, the bias update uses the already-unwhitened model.W{1}, which makes model.b{1} incorrect and may weaken model performance.
So I simply swapped the order of the two lines:
model.b{1} = model.W{1} * whitening_transformation.b + model.b{1};
model.W{1} = model.W{1} * whitening_transformation.A;
After this change, I saw a significant improvement in my experiments. ^.^
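The ordering issue can be sanity-checked numerically. Below is a minimal NumPy sketch: A and b_w stand in for whitening_transformation.A and whitening_transformation.b, and W and b for model.W{1} and model.b{1}; the shapes and random values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical whitening transform: x_white = A @ x + b_w
A = rng.standard_normal((4, 4))
b_w = rng.standard_normal(4)

# A "trained" first layer operating on whitened features: y = W @ x_white + b
W = rng.standard_normal((3, 4))
b = rng.standard_normal(3)

x = rng.standard_normal(4)      # raw (unwhitened) feature vector
y_ref = W @ (A @ x + b_w) + b   # output the unwhitened layer must reproduce

# Correct order: update the bias with the ORIGINAL W, then fold A into W.
b_new = W @ b_w + b
W_new = W @ A
assert np.allclose(W_new @ x + b_new, y_ref)

# Buggy order (as in the original snippet): W is overwritten first,
# so the bias update uses W @ A instead of W, and the output is wrong.
W_bug = W @ A
b_bug = W_bug @ b_w + b
assert not np.allclose(W_bug @ x + b_bug, y_ref)
```

The algebra behind it: y = W(Ax + b_w) + b = (WA)x + (W b_w + b), so the unwhitened weights are WA and the unwhitened bias is W b_w + b — and the bias term needs the original W, not WA.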
hi,
@jonbarron, thanks for sharing the code of ffcc. I can't understand the meaning of the conv operations in the function "RenderHistogramGaussian". The code is as follows:
% Threshold the mahalanobis distance at 3 to make a binary mask, which is
% dilated to produce an ellipse with a dot in the center.
mask = mahal_dist <= 3;
prediction = (conv2(double(mask), ones(3,3), 'same') > 0) & ~mask;
prediction = prediction | (conv2(double(mahal_dist == min(mahal_dist(:))), [0, 1, 0; 1, 1, 1; 0, 1, 0], 'same') > 0);
% Optionally create a mask with the ground-truth white point rendered as a dot.
if ~isempty(mu_true)
D = (us - mu_true(1)).^2 + (vs - mu_true(2)).^2;
truth = D == min(D(:));
truth = (conv2(double(truth), [0, 1, 0; 1, 1, 1; 0, 1, 0], 'same') > 0);
truth = (conv2(double(truth), [0, 1, 0; 1, 1, 1; 0, 1, 0], 'same') > 0);
I want to ask what these conv operations are for.
Looking forward to your reply!
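Those conv2 calls are morphological dilations implemented with convolution: dilating the binary mask with ones(3,3) and AND-ing with ~mask keeps only the one-pixel ring just outside the ellipse, and the cross-shaped kernel turns the single argmin pixel into a small dot. A NumPy sketch of the same idea (the dilate helper and the small test arrays are illustrative, not FFCC code):

```python
import numpy as np

def dilate(mask, kernel):
    """Binary dilation via explicit 'same'-sized sliding window, which is
    what `conv2(double(mask), kernel, 'same') > 0` computes for these
    symmetric 0/1 kernels."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(mask, ((ph, ph), (pw, pw)))
    out = np.zeros_like(mask)
    for dy in range(kh):
        for dx in range(kw):
            if kernel[dy, dx]:
                out |= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

# A small binary disk standing in for `mahal_dist <= 3`.
ys, xs = np.mgrid[:9, :9]
mask = (ys - 4) ** 2 + (xs - 4) ** 2 <= 6

# Dilate, then AND with ~mask: a one-pixel ring strictly outside the disk.
ring = dilate(mask, np.ones((3, 3))) & ~mask
assert ring.any()
assert not (ring & mask).any()   # the ring never overlaps the mask itself

# Dilating a single "argmin" pixel with the cross kernel draws a 5-pixel dot.
cross = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]])
dot = np.zeros_like(mask)
dot[4, 4] = True
dot = dilate(dot, cross)
assert dot.sum() == 5
```

So the snippet draws the ellipse as an outline rather than a filled blob, with dots marking the predicted (and optionally true) white points; applying the cross dilation twice, as the ground-truth branch does, simply makes a slightly larger dot.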
@jonbarron
hi,
When I set params.DEEP.ENABLED to true, a bug appears involving length(params.DEEP.FEATURE_FILENAME_TAGS).
params.DEEP.FEATURE_FILENAME_TAGS is defined as 'feature'.
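One plausible cause (an assumption, not confirmed from the source): in MATLAB, length('feature') is 7 because a char vector is an array of characters, whereas length({'feature'}) is 1 — so code that expects a cell array of tags misbehaves when given a bare string. Python has exactly the same distinction:

```python
# A bare string counts characters; a one-element list counts elements.
# This mirrors MATLAB's length('feature') == 7 vs. length({'feature'}) == 1.
assert len('feature') == 7
assert len(['feature']) == 1
```

If this is the issue, wrapping the tag as a cell array ({'feature'}) in the params definition would be the MATLAB-side fix.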
Hi,
@jonbarron , thanks for sharing the ffcc project.
I am studying the ffcc project and have run into two problems.
First, I cannot understand why MaskedLocalAbsoluteDeviation is used to compute im_channels{2}. I traced the computation of im_channels:
function im_channels = ChannelizeImage(im, mask)
% Generate feature images (color and gradient) and combine them into
% different cell channels.
assert(isa(mask, 'logical'));
im_channels = {};
im_channels{1} = cast(bsxfun(@times, double(im), mask), 'like', im);
im_channels{2} = cast(MaskedLocalAbsoluteDeviation(im, mask), 'like', im);
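For intuition: the first channel is the masked color image, and the second is a gradient-like texture measure — per the CCC/FFCC papers, edge colors carry additional information about the illuminant, so the model learns a separate filter for that channel. A minimal NumPy sketch of the idea behind a local absolute deviation (illustration only, not the exact MaskedLocalAbsoluteDeviation implementation, which also handles the mask):

```python
import numpy as np

def local_absolute_deviation(im):
    """Per-pixel |value - local 3x3 mean|: responds to edges and texture,
    and is zero on constant regions."""
    padded = np.pad(im, 1, mode='edge')
    local_mean = np.zeros_like(im, dtype=float)
    for dy in range(3):
        for dx in range(3):
            local_mean += padded[dy:dy + im.shape[0], dx:dx + im.shape[1]]
    local_mean /= 9.0
    return np.abs(im - local_mean)

flat = np.full((5, 5), 7.0)                  # constant region: no texture
assert np.allclose(local_absolute_deviation(flat), 0.0)

step = np.zeros((5, 5)); step[:, 3:] = 1.0   # an edge gives a nonzero response
assert local_absolute_deviation(step).max() > 0
```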
im_channels = ChannelizeImage(im, mask); % 256*384*3, 256*384*3
...... % histogram features
X = Xc; % 64*64*2
...... % get the Fourier transforms (F_fft and X_fft) of the model parameters and X
FX_fft = sum(bsxfun(@times, X_fft, F_fft), 3)
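That last line is the convolution theorem at work: multiplying X_fft by F_fft elementwise and summing over the third (channel) dimension is a fast way to circularly convolve each histogram channel with its filter and sum the results. A NumPy check of that equivalence (small 8×8 arrays stand in for FFCC's 64×64 histograms):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 8, 2))    # two histogram channels
F = rng.standard_normal((8, 8, 2))    # one filter per channel

# Elementwise product in the Fourier domain, summed over channels...
FX_fft = np.sum(np.fft.fft2(X, axes=(0, 1)) * np.fft.fft2(F, axes=(0, 1)), axis=2)
FX = np.real(np.fft.ifft2(FX_fft))

# ...equals the sum of per-channel CIRCULAR (wrap-around) convolutions.
def circ_conv(x, f):
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out += x[i, j] * np.roll(np.roll(f, i, axis=0), j, axis=1)
    return out

FX_direct = sum(circ_conv(X[..., c], F[..., c]) for c in range(2))
assert np.allclose(FX, FX_direct)
```

The wrap-around is deliberate in FFCC: the (u, v) chroma histogram is treated as toroidal, so circular convolution is the right operation, and the FFT makes it cheap.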
Second, how should I understand the computation preconditioner.F_fft * model_precond.F_fft that produces model.F_fft? Why not just use the variable model_precond as model.F_fft? I discovered that the values of preconditioner.F_fft are approximately zero.
% the computational process of the preconditioner.F_fft
u_variation_fft = abs(fft2([-1; 1]/sqrt(8), X_sz(1), X_sz(2))).^2;
v_variation_fft = abs(fft2([-1, 1]/sqrt(8), X_sz(1), X_sz(2))).^2;
total_variation_fft = u_variation_fft + v_variation_fft;
% A helper function for applying a scale and shift to a stack of images.
apply_scale_shift = @(x, m, b)bsxfun(@plus, bsxfun(@times, x, permute(m(:), [2,3,1])), permute(b(:), [2,3,1]));
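For intuition (the exact FFCC formula is not shown here, so the preconditioner below is a hypothetical stand-in): total_variation_fft is the Fourier-domain spectrum of the finite-difference regularizer, and a preconditioner built from it rescales each frequency so the regularized objective is well-conditioned for the optimizer. The solver then works on the preconditioned variable model_precond, and multiplying by preconditioner.F_fft maps it back to the true model.F_fft — which is why model_precond alone is not the model. Heavily penalized high frequencies get scaled toward zero, consistent with the near-zero values you observed. A NumPy sketch:

```python
import numpy as np

n = 8  # small stand-in for the 64x64 histogram size (assumption)

# Fourier-domain spectra of the finite-difference (total variation)
# penalties, mirroring the MATLAB lines above.
u_variation_fft = np.abs(np.fft.fft2(np.array([[-1.0], [1.0]]) / np.sqrt(8), s=(n, n))) ** 2
v_variation_fft = np.abs(np.fft.fft2(np.array([[-1.0, 1.0]]) / np.sqrt(8), s=(n, n))) ** 2
total_variation_fft = u_variation_fft + v_variation_fft

# Hypothetical preconditioner (illustration only, NOT FFCC's exact formula):
# divide each frequency by the square root of its regularization weight.
lam, eps = 10.0, 1e-8
precond = 1.0 / np.sqrt(lam * total_variation_fft + eps)

# The DC term is unpenalized by TV, so its preconditioner value is large;
# the Nyquist frequency is penalized most, so its value is small.
assert abs(total_variation_fft[0, 0]) < 1e-12
assert precond[0, 0] > precond[n // 2, n // 2]
```

Under this view, model.F_fft = preconditioner.F_fft .* model_precond.F_fft is just the change of variables that undoes the rescaling after optimization.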
@jonbarron, I will be looking forward to your reply! Thanks!