Some random videos and screenshots from things I've done for Igalia, either for work and training or for fun. It's mostly a selection of posts I've written on my blog in the past and can be found under the
My method to improve the upsampling of the SSAO render target of Iago Toral's Vulkan
I used the values of a downscaled normal buffer to classify the low resolution image pixel neighborhoods in two categories: neighborhoods where
all the pixels are part of the same surface and neighborhoods where there's a surface discontinuity. Then I used linear interpolation to upsample
the surface neighborhoods and a depth aware algorithm to upsample the neighborhoods that contain discontinuities. I used my own downsampling method
to select the most representative pixel of each neighborhood, and normals instead of depths to classify the pixels. The algorithm seemed to significantly
improve the visual results compared to three other depth-aware downsampling
algorithms I tried (max depth, min depth, and min/max depth selected alternately in a checkerboard pattern), all of which are used with depth-based classification.
A step-by-step analysis of all the experiments I did, how I came up with the idea for the final algorithm, as well as
metrics and comparisons can be found in the series of blog posts below. Part 6 is a
complete description of the final method I used.
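The classification idea can be sketched as a few lines of CPU-side Python. This is only an illustration of the logic described above, with a made-up data layout (lists of neighbor values per high-res pixel); the real implementation runs in a fragment shader, and the threshold value here is an assumption:

```python
# Sketch of normal-aware SSAO upsampling for one high-res pixel.
# Hypothetical data layout; the actual method runs on the GPU.

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def same_surface(normals, thr=0.95):
    """A low-res neighborhood belongs to a single surface when all of
    its normals point (almost) the same way. `thr` is an assumed value."""
    base = normals[0]
    return all(dot3(base, n) >= thr for n in normals[1:])

def upsample(ao, normals, depths, hi_depth, weights):
    """Upsample one high-res pixel from its 4 low-res neighbors.
    ao/normals/depths: values of the 4 neighbors; weights: the pixel's
    bilinear weights; hi_depth: the high-res depth at the pixel."""
    if same_surface(normals):
        # Continuous surface: plain linear interpolation is safe.
        return sum(w * a for w, a in zip(weights, ao))
    # Discontinuity: fall back to a depth-aware choice, here the
    # neighbor whose depth is closest to the high-res depth.
    best = min(range(4), key=lambda i: abs(depths[i] - hi_depth))
    return ao[best]

# Flat neighborhood -> bilinear path:
ao_hi = upsample([0.2, 0.4, 0.6, 0.8], [(0, 0, 1)] * 4,
                 [1.0] * 4, 1.0, [0.25] * 4)  # -> 0.5
```

The key point is that the branch is decided by normals, not depths; the depth comparison is only used once a discontinuity has already been detected.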
This was a demo I started when I joined Igalia; its purpose was to train myself in computer graphics. I created a
procedural terrain with a number of cows on it, where the user can set the number of polygons, the size, and of course the number of cows.
I experimented with graphics concepts like environment mapping, image-based lighting, morphing, surface subdivision and more, and I
used a Poisson distribution based trick to uniformly spread the cows across the terrain. Unfortunately, I never finished the additional
Vulkan backend I wanted to implement, due to lack of time.
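One plausible form of such a Poisson-based placement trick is Poisson-disc "dart throwing": accept random positions only if they keep a minimum distance from every cow already placed. This is a generic sketch, not the demo's actual code (the demo is not in Python, and the function name and parameters here are made up):

```python
# Minimal Poisson-disc placement by rejection sampling ("dart throwing").
# Illustrative sketch only; the demo's actual trick may differ.
import random

def poisson_disc(n, min_dist, size=100.0, max_tries=10000, seed=0):
    """Return up to n points in a size x size square, each at least
    min_dist away from every previously accepted point."""
    rng = random.Random(seed)
    points = []
    tries = 0
    while len(points) < n and tries < max_tries:
        tries += 1
        p = (rng.uniform(0, size), rng.uniform(0, size))
        # Accept the dart only if it is far enough from all others.
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2
               for q in points):
            points.append(p)
    return points

cow_positions = poisson_disc(20, min_dist=10.0)
```

Compared to plain uniform random placement, this avoids clumps and overlapping cows, which is why the distribution looks evenly spread.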
Posts related to graphics drivers development
Posts related to contributions to graphics drivers and APIs as part of my work
for Igalia's graphics team.
OpenGL and Vulkan interoperability on Linux related posts. These are posts related to the implementation of the
EXT_external_objects(_fd) extensions, which allow OpenGL and Vulkan to access the same resources. As part of my work for
Igalia I've contributed to these extensions for iris (patches), radeonsi (some fixes to the already implemented feature) and
i965 (review), and I've written a Vulkan framework for piglit plus tests that check several extension use cases in different
drivers (intel, amd, freedreno). The posts are mostly about these use cases:
The VK_EXT_sample_locations extension allows an application to modify the locations of the samples within a pixel that are used in rasterization.
Additionally, it allows applications to specify different sample locations for each pixel in a group of adjacent
pixels, which can increase antialiasing quality (particularly when a custom resolve shader takes
advantage of these different locations). I've implemented it for the anv Vulkan driver.
an OpenGL shader viewer that can instantly show the changes (hobby project)
This was a project I started to save myself time when writing shader tests.
When I started adding tests to piglit and vkrunner, I found
it a bit frustrating to save the file, compile, then run the test and
check the result for errors. In the case of vkrunner, the process was even
slower, as I had to generate the SPIR-V, and the result was rendered to a PPM image that
had to be checked with an external viewer.
To speed this process up,
I wrote a minimal pixel shader viewer that instantly shows the changes
in a shader file when the file is saved. The viewer only supports pixel shaders, but it can read both GLSL pixel shaders and
formatted shader tests that contain a pixel shader (the parser is quite minimal, but it does the job).
I now use this viewer when I need to write a pixel shader quickly, and as soon as
everything looks fine to me I add it to piglit or vkrunner. In the case of vkrunner, I sometimes have to take into
account differences in the coordinate system, but I still feel that development is faster this way.
an OpenGL/SPIRV example
At Igalia, I've been working on GL and Vulkan extensions (e.g. VK_EXT_sample_locations, GL_EXT_external_objects, GL_EXT_external_objects_fd).
This post is from when I wanted to start contributing tests and bug fixes to
the work on the ARB_gl_spirv extension that was done by my
colleagues. I wrote a small GL/SPIRV
example to understand how I should use the extension and
to familiarize myself with the team's work on it.
A hack to display the Vulkan CTS tests output
An additional script/configuration I use to quickly debug the output of the Vulkan CTS (Khronos's conformance tests).
It can instantly display the test result as if it were running in a window.
Demos inspired by conferences (these were all written for fun)
Adoption of ANGLE in WebKit on Linux
This talk was about the experiments I performed to find an optimal way to share texture data between the main WebKit
rendering process (which uses the native Linux OpenGL driver) and WebGL, which uses ANGLE, without copying the data from one render
target to the other. I first tried using a shared EGL/OpenGL context between the main rendering process and the EGL
backend, and then Linux dma-buf buffers. The second method had some advantages: it could work with multiple processes, and
it didn't require changes in the ANGLE backend, only some EGL extensions to be used in the drivers.