This project is a parameterized Chisel generator for image processing hardware. The top-level hardware module accepts a configurable number of pixels each cycle, applies the selected filter to the image, and streams the processed pixels out in the same fashion.
- Support for two types of image filters:
  - Basic (pixel-level)
  - Convolution (kernel-based)
- Parallelism
- Implemented filters:
  - Blur
  - Bump
  - Grayscale
  - Solarize
- Generic implementation to support new filters
To instantiate an image processor, use our generator interface:

```scala
val p = ImageProcessorParams(...)                               // Image processor parameters
val filterName = ...                                            // Name of the filter
val imageProcessor = ImageProcessorGenerator.get(p, filterName) // Image processor instance
```
If you would like to add your own filter, you need to do two things. First, add your filter logic in a class like the following:

```scala
class MyFilter(p: ImageProcessorParams) extends FilterOperator(p, ???, ???) {
  // Your filter logic here (see FilterOperator.scala for implementation examples)
}
```
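For intuition, a pixel-level (basic) filter such as Grayscale boils down to per-pixel arithmetic. The following is a plain-Scala software sketch, not the Chisel filter itself; `GrayscaleSketch` and the equal-weight average are illustrative assumptions (grayscale conversions often use weighted luma instead):

```scala
// Software sketch (plain Scala, not Chisel) of a pixel-level grayscale
// operation: replace each channel with the average of the three channels.
object GrayscaleSketch {
  def grayscale(r: Int, g: Int, b: Int): (Int, Int, Int) = {
    val avg = (r + g + b) / 3 // integer division, as hardware would truncate
    (avg, avg, avg)
  }
}
```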
Second, add an entry to the filter generator interface. If your filter is a convolution (kernel-based) filter, also make sure `FilterGenerator.isKernelFilter` returns `true` for it; you can extend the `Vector` in that function with your filter's name.
```scala
object FilterGenerator {
  ...
  val myfilter = "myfilter"
  ...
  def get(p: ImageProcessorParams, name: String): FilterOperator = {
    ...
    } else if (name == myfilter) {
      return new MyFilter(p)
    }
    ...
  }
}
```
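Since the snippet above elides the surrounding details, here is a self-contained plain-Scala sketch of the same name-keyed dispatch idea, including the `Vector` membership check behind `isKernelFilter`. All names below are illustrative, not the repository's actual code:

```scala
// Plain-Scala sketch of a name-keyed filter dispatch (illustrative only).
object FilterDispatchSketch {
  val blur = "blur"
  val myfilter = "myfilter"

  // Kernel-based filter names are collected in a Vector; membership decides
  // whether the processor needs kernel-style (row-buffered) handling.
  val kernelFilters = Vector(blur, myfilter)

  def isKernelFilter(name: String): Boolean = kernelFilters.contains(name)
}
```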
- Run `sbt test`.
- The tester takes `sample.png` from the image directory and writes each filter's output to the temp directory, once from the library and once from the processor.
- Add more filters to match their equivalents from the library.
- After parameterizing the pipeline, parallelize filter application so that the pipeline won't have to stall.
- Parameterize the pipeline so that the processor can take a variable number of pixels every cycle.
- Find a strategy for edge pixels.
  - Since edge pixels are missing some of their neighbors, the kernel extends beyond the image boundary.
  - Currently the processor doesn't apply filters to edge pixels since we haven't decided how to handle them.
  - We could treat the missing pixels as empty, duplicate pixels from the nearest edge, or wrap pixels around from the other side.
- Add support for non-convolutional (no kernel) image filters.
  - We can create two FSMs for the two types of image processors: `BasicImageProcessor` and `KernelImageProcessor`.
  - These two image processors would inherit from a parent class `ImageProcessor` that also holds the common logic.
  - The user interface would be unaware of this difference and would use a generator, or instantiate only the `ImageProcessor` class.
  - The logic of `BasicImageProcessor` would be simpler, and it could output the new pixel immediately (no need for row buffers).
- Check if output pixels match the library in unit tests.
- Add FSM for a filter that uses a kernel.
- Implement simple image processor logic without any filtering (output the same image).
- Add `ImageProcessorModel` to model the behavior of the hardware using a library.
  - Also add unit tests for this model.
  - The model should read/write image files and prepare input pixels.
- Add a simple image for unit tests.
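The candidate edge strategies mentioned in the list above (treat missing neighbors as empty, duplicate the nearest edge, or wrap around to the other side) can be sketched in plain Scala for a single row of pixels; the function names here are illustrative, not part of the generator:

```scala
// Plain-Scala sketch (no Chisel) of three candidate edge-pixel strategies,
// shown on a single row. An index outside [0, length) is a missing neighbor.
object EdgePaddingSketch {
  val row = Vector(10, 20, 30, 40)

  // Zero padding: treat missing neighbors as empty (0).
  def zeroPad(i: Int): Int =
    if (i < 0 || i >= row.length) 0 else row(i)

  // Clamp: duplicate the pixel from the nearest edge.
  def clampPad(i: Int): Int =
    row(math.max(0, math.min(row.length - 1, i)))

  // Wrap: take the pixel from the other side of the image.
  def wrapPad(i: Int): Int =
    row(((i % row.length) + row.length) % row.length)
}
```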
- We're using the Scrimage library for unit tests. Because of floating-point precision issues, rounding pixel values can produce results that differ slightly. Therefore, we test pixel values with a tolerance of ±1.
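The ±1 tolerance described above amounts to comparing channel values within an absolute bound; a minimal sketch (the helper name is illustrative, not the actual test code):

```scala
// Minimal off-by-one tolerance comparison for pixel channel values.
object ToleranceSketch {
  def closeEnough(expected: Int, actual: Int, tol: Int = 1): Boolean =
    math.abs(expected - actual) <= tol
}
```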
- The image processor implementation assumes that the convolution filters' kernel size is 3x3. Some parts of the implementation are parameterized, but the row buffer usage will need to change if larger kernel sizes are required.
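As a software reference for the 3x3 assumption, here is a plain-Scala convolution pass that, like the current hardware, leaves edge pixels unfiltered. This is a behavioral sketch under illustrative names, not the Chisel pipeline:

```scala
// Behavioral sketch (plain Scala) of a 3x3 convolution over a grayscale
// image. Edge pixels are passed through unfiltered, mirroring the current
// hardware behavior described in the notes above.
object ConvSketch {
  def convolve3x3(img: Vector[Vector[Int]],
                  k: Vector[Vector[Int]]): Vector[Vector[Int]] = {
    val h = img.length
    val w = img.head.length
    Vector.tabulate(h, w) { (r, c) =>
      if (r == 0 || c == 0 || r == h - 1 || c == w - 1) {
        img(r)(c) // edge pixel: no filtering applied
      } else {
        // Multiply-accumulate the 3x3 neighborhood with the kernel.
        var acc = 0
        for (dr <- -1 to 1; dc <- -1 to 1)
          acc += img(r + dr)(c + dc) * k(dr + 1)(dc + 1)
        acc
      }
    }
  }
}
```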