jamy-l / handheld-multi-frame-super-resolution
Handheld Multi-image Super-resolution [Wronski et al., SIGGRAPH19]. Non-official GPU-supported Python implementation.
License: MIT License
Hi!
I'm trying to use this method on my own photos in .dng format, but I get the following error:
Traceback (most recent call last):
File "/home/jovyan/fominaav/Multi-frame-SR/Handheld-Multi-Frame-Super-Resolution/run_handheld.py", line 176, in <module>
handheld_output = process(args.impath, options, params)
File "/home/jovyan/fominaav/Multi-frame-SR/Handheld-Multi-Frame-Super-Resolution/handheld_super_resolution/super_resolution.py", line 325, in process
ref_raw, raw_comp, ISO, tags, CFA, xyz2cam, ref_path = load_dng_burst(burst_path)
File "/home/jovyan/fominaav/Multi-frame-SR/Handheld-Multi-Frame-Super-Resolution/handheld_super_resolution/utils_dng.py", line 85, in load_dng_burst
white_level = tags['Image Tag 0xC61D'].values[0] # there is only one white level
KeyError: 'Image Tag 0xC61D'
I checked the tags in your .dng files and mine - they are different. Is there any tool to fix this?
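Not the maintainer, but one workaround sketch: different cameras expose the DNG WhiteLevel tag (0xC61D) under different names in exifread, so a fallback lookup over a few candidate keys can help. The candidate key list below is an assumption, not exifread's documented behaviour:

```python
def find_white_level(tags, default=None):
    """Try the names under which exifread may expose the DNG WhiteLevel tag.

    `tags` is the dict returned by exifread.process_file(). The key list
    here is a guess based on common DNG layouts, not an official API.
    """
    for key in ('Image Tag 0xC61D', 'Image WhiteLevel', 'SubIFD WhiteLevel'):
        if key in tags:
            return tags[key].values[0]
    return default
```

Alternatively, exiftool can write the tag into the file directly (something like `exiftool "-WhiteLevel=1023" my.dng`), but check your sensor's actual white level first.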
Hello Jamy
This method doesn't give alpha and beta parameters; it gives a mean curve and a std curve instead.
In the run_fast_MC method, the alpha and beta parameters are used to compute the std curve and the mean difference curve, if I understand correctly.
Can I use the mean and std curves instead? The magnitudes and the number of sampling points on the curves returned by run_fast_MC are wildly different.
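Not the author, but if your calibration yields mean/std curves rather than alpha/beta, one option is a least-squares line fit, assuming the usual Poisson-Gaussian model var = alpha * mean + beta (in the same normalized units the pipeline expects — that normalization is camera-specific and not shown here). A sketch:

```python
import numpy as np

def fit_alpha_beta(means, stds):
    """Least-squares fit of the affine noise model var = alpha * mean + beta.

    means, stds: 1-D arrays sampled along the calibration curves.
    Returns (alpha, beta). Assumes both arrays are already normalized
    the way the super-resolution pipeline expects.
    """
    means = np.asarray(means, dtype=np.float64)
    var = np.asarray(stds, dtype=np.float64) ** 2
    alpha, beta = np.polyfit(means, var, 1)
    return alpha, beta
```

If the fitted line is a poor match to the measured variance, the affine model assumption itself may not hold for that sensor.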
In linalg.py, line 157: e2[0] = 0; e2[0] = 1 should be e2[0] = 0; e2[1] = 1.
Hi, this repo contains a really nice and detailed implementation and explanation. I have a more methodological question: I'm reading your IPOL paper, and judging from eq. 1, it appears the registration algorithm assumes no rotational mismatch between blocks from different frames. Could you please kindly confirm that, or am I missing something? Thank you.
Hello, I noticed that when running your program, the colors in the generated results appear different from the low-resolution images. I'm not sure if this is normal. I have tried adjusting various parameters, but I couldn't match the colors with the low-resolution photos.
I have attached my results. The left side is the low-resolution first frame, the middle is the super-resolution result with color correction, and the right side is the super-resolution result without color correction.
Thanks for the nice work. It looks like a standard 2x2 Bayer filter is assumed. Are there any plans to relax that assumption to work with other configurations such as Quad-CFA?
Some mobile phones (e.g. Xiaomi Mi 10 Ultra) are now equipped with such sensors.
Single samples (no burst) can be found here:
darktable-org/rawspeed#256
https://drive.google.com/drive/folders/1EPZEpvBHupHehsYI600f9cS7m_MaVhSQ?usp=sharing
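Not a maintainer, but until Quad-CFA is supported natively, a common workaround is to bin each 2x2 same-colour block down to a standard Bayer mosaic at half resolution. This sketch assumes the quad layout groups same-colour pixels in 2x2 blocks (as on typical Quad-CFA sensors) and of course gives up the resolution advantage:

```python
import numpy as np

def quad_bin(raw):
    """Average each 2x2 same-colour block of a Quad-CFA frame.

    Returns a half-resolution standard Bayer mosaic (float32).
    Assumes same-colour pixels sit in aligned 2x2 blocks starting at (0, 0).
    """
    h, w = raw.shape
    r = raw[:h - h % 2, :w - w % 2].astype(np.float32)  # crop to even size
    return 0.25 * (r[0::2, 0::2] + r[0::2, 1::2]
                   + r[1::2, 0::2] + r[1::2, 1::2])
```

The binned frames could then be fed to the existing Bayer pipeline, though black/white levels and noise parameters would need rescaling to account for the averaging.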
Hi!
I'm trying to use this method with images I only have in PNG format. Is there any way to do this?
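Not the author, but since the pipeline expects a Bayer mosaic, one way to experiment with PNGs is to remosaic the RGB data into a synthetic RGGB pattern. This is only a sketch: a demosaiced PNG has lost the raw-domain noise statistics the method is calibrated for, so results will not match a real raw burst.

```python
import numpy as np

def rgb_to_bayer(rgb):
    """Build a synthetic RGGB mosaic from an RGB array of shape (H, W, 3).

    Samples R at (even, even), G at (even, odd) and (odd, even),
    B at (odd, odd) -- a hypothetical remosaicing for experimentation only.
    """
    h, w, _ = rgb.shape
    bayer = np.empty((h, w), dtype=rgb.dtype)
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B
    return bayer
```

You would still need plausible black/white levels and noise parameters for the fusion and robustness steps to behave sensibly.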
Hi Jamy,
in flat regions, after applying the kernel, did you find the result blurry? If so, how did you solve it? Thanks.
I use VS Code with a remote Ubuntu server to debug, but I can't step into @cuda.jit functions.
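One thing that may help: Numba can run @cuda.jit kernels in a pure-Python simulator when the NUMBA_ENABLE_CUDASIM environment variable is set before numba is first imported, which lets ordinary breakpoints work inside kernels. It is much slower and not bit-exact, so it is for debugging only:

```python
import os

# Must be set BEFORE numba is first imported -- e.g. at the very top of
# run_handheld.py, or from the shell:
#   NUMBA_ENABLE_CUDASIM=1 python run_handheld.py ...
os.environ["NUMBA_ENABLE_CUDASIM"] = "1"

# from numba import cuda  # kernels now execute as plain Python
```

With the simulator active, VS Code breakpoints set inside kernel bodies should be hit like any other Python code.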
Would be great to have an option to generate a 16-bit processed LinearRaw DNG for further manual tonemapping. I briefly grepped the code and saw that you are using int8 during processing, so I'm not sure whether it would be easy to rewrite the program for this goal. Regarding the file metadata, you would probably want to still apply the white balance and change the coefficients in the output file, while not doing any color transforms and just passing those and other tags to the new DNG. These scripts use exiftool and could be helpful:
https://github.com/gluijk/dng-from-tiff/blob/main/dngmaker.bat https://github.com/antonwolf/dng_stacker/blob/master/dng_stacker.bat
I'm getting this output and error both when I try to run your code on my local machine and when I try to run your online demo.
RAW images are taken with a Galaxy S22+
DR=dr-limule-docker-gpu
Non-zero exit code (1): Parameters:
Upscaling factor: 2
Super-resolution with 25 images.
Robustness: enabled
-------------------------
t: 0.12
s1: 2.00
s2: 12.00
Mt: 0.80
Robustness denoising: enabled
Alignment:
-------------------------
ICA Iterations: 3
Fusion:
-------------------------
Kernel shape: handheld
k_stretch: 4.00
k_shrink: 2.00
k_detail: SNR based
k_denoise: SNR based
Processing with handheld super-resolution
Traceback (most recent call last):
File "/workdir/bin//demo.py", line 185, in <module>
handheld_output, debug_dict = process(args.impath, options, params)
File "/workdir/bin/handheld_super_resolution/super_resolution.py", line 360, in process
black_levels = np.array([int(x.decimal()) for x in black_levels.values])
File "/workdir/bin/handheld_super_resolution/super_resolution.py", line 360, in <listcomp>
black_levels = np.array([int(x.decimal()) for x in black_levels.values])
AttributeError: 'int' object has no attribute 'decimal'
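Not the maintainer, but the crash suggests that on these S22+ DNGs exifread hands back the black levels as plain ints, whereas the code expects Ratio objects with a .decimal() method. A hedged patch idea is to coerce both cases:

```python
def as_int(value):
    """Coerce an exifread tag value to int.

    exifread may return a Ratio (with a .decimal() method) or a plain int,
    depending on how the DNG stores BlackLevel -- handle both.
    """
    if hasattr(value, "decimal"):
        return int(value.decimal())
    return int(value)

# Hypothetical use at the failing site:
# black_levels = np.array([as_int(x) for x in black_levels.values])
```

This keeps the existing behaviour for Ratio-valued tags while accepting integer-valued ones.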
I'm using raw images captured on a Hikrobot machine vision camera in Bayer RG8 pixel format, then converting them to DNG and adding the EXIF data myself. The DNG images look totally normal, but the output I get from running them through this model is all green. Is there a parameter I can change somewhere, or something else I can do to get the output images back to normal?
Your project is amazing by the way!
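Not the author, but an all-green result often means the CFA pattern the code assumes doesn't match the mosaic phase of the files (e.g. GRBG data interpreted as RGGB, so green samples land in the red/blue slots). One cheap experiment, assuming a 2-D mosaic array, is to shift the crop origin by one pixel and re-run:

```python
def shift_cfa_phase(raw, row_shift=0, col_shift=0):
    """Crop the mosaic so its top-left pixel moves by (row_shift, col_shift).

    Shifting by one row and/or column changes which colour sits at (0, 0),
    e.g. turning a GRBG-phased mosaic into an RGGB-phased one.
    Shifts are assumed to be 0 or 1.
    """
    return raw[row_shift:, col_shift:]
```

Trying the four (row_shift, col_shift) combinations in {0, 1} quickly reveals whether the colour cast is a phase mismatch or something else (e.g. white balance metadata).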
Hello, I have a question. I downloaded the Samsung dataset, obtained the low-resolution images, and the program runs successfully. However, I have been searching for a long time and cannot find the high-resolution images needed to compare the results and compute PSNR and SSIM. I appreciate any help.
Thank you so much.
Hi Jamy:
I tried to run this project with my raw images; the output image is blurry and has some artifacts. Is this a problem with the alpha/beta params?
How can I match them to this project? How do I calibrate the alpha/beta params?
Looking forward to your reply.