Code for Igalia

Some random videos and screenshots from things I've done for Igalia, either for work and training or for fun. It's mostly a selection of posts I've written on my blog over the years; they can all be found under the Igalia category.

Computer graphics in OpenGL and Vulkan

The algorithm applied
in Iago Toral's SSAO.
An algorithm to upscale lower resolution textures (Vulkan)

My method to improve the upsampling of the SSAO render target of Iago Toral's Vulkan vkdf demo.

I used the values of a downscaled normal buffer to classify the low resolution image pixel neighborhoods into two categories: neighborhoods where all the pixels belong to the same surface, and neighborhoods that contain a surface discontinuity. I then used linear interpolation to upsample the surface neighborhoods and a depth aware algorithm to upsample the neighborhoods that contain discontinuities. I also used my own downsampling method to select the most representative pixel of each neighborhood, and normals instead of depths to classify the pixels.

The algorithm seemed to significantly improve the visual results compared to 3 other depth aware downsampling algorithms I tried (max depth, min depth, and min/max depth selected alternately following a checkerboard pattern) combined with depth-based classification. A step by step analysis of all the experiments I did and how I came up with the idea for the final algorithm, as well as metrics and comparisons, can be found in the series of blog posts below. Part 6 is a complete description of the final method I used.
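The classify-then-interpolate idea can be sketched roughly as follows. This is a minimal Python illustration with hypothetical function names and a flattened 2×2 low-res neighborhood, not the actual vkdf shader code; the normal-similarity threshold and bilinear weights are made-up values:

```python
import numpy as np

def classify_neighborhood(normals, thresh=0.95):
    """Return True if all (unit) normals in the low-res neighborhood
    belong to the same surface, i.e. every pairwise dot product is
    above `thresh`; False means there's a surface discontinuity."""
    n = normals.reshape(-1, 3)
    dots = n @ n.T
    return bool(np.all(dots >= thresh))

def upsample_pixel(neigh_ao, neigh_depth, neigh_normals, weights, hi_depth):
    """Upsample one high-res pixel from its 2x2 low-res neighborhood:
    linear interpolation on continuous surfaces, nearest-depth
    selection across discontinuities (the depth aware path)."""
    if classify_neighborhood(neigh_normals):
        return float(np.dot(neigh_ao, weights))  # bilinear blend
    # Depth aware: pick the low-res sample whose depth is closest
    # to the high-res pixel's depth.
    nearest = int(np.argmin(np.abs(neigh_depth - hi_depth)))
    return float(neigh_ao[nearest])
```

In the surface case all four samples contribute, so edges inside a surface stay smooth; in the discontinuity case only the depth-nearest sample is used, so AO values don't bleed across object silhouettes.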

Depth aware upsampling experiments blog posts:
Terrain with cows.
A procedural terrain (OpenGL)

That was an OpenGL demo I started when I joined Igalia: a procedural terrain with a number of cows on it, where the user can set the number of polygons, the terrain size, and the number of cows. I experimented with graphics concepts like environment mapping, image based lighting, morphing, surface subdivision and more, and I used a Poisson distribution based trick to uniformly spread the cows on the terrain.

Blog post: A terrain rendering approach.
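A Poisson-disk-style spread like the one used for the cows is often approximated with simple dart throwing: accept a random candidate only if it keeps a minimum distance from everything placed so far. A minimal Python sketch (the function name and parameters are illustrative, not from the demo):

```python
import random

def scatter_points(n, width, height, min_dist, max_tries=10000, seed=None):
    """Dart-throwing approximation of a Poisson-disk distribution:
    a candidate position is kept only if it is at least `min_dist`
    away from every previously accepted point, which spreads the
    points uniformly without clumps."""
    rng = random.Random(seed)
    points = []
    tries = 0
    while len(points) < n and tries < max_tries:
        tries += 1
        p = (rng.uniform(0.0, width), rng.uniform(0.0, height))
        # Squared-distance test against all accepted points.
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2
               for q in points):
            points.append(p)
    return points
```

The `max_tries` cap keeps the loop from spinning forever when the requested density is too high for the area; in practice the placements would then be mapped onto the terrain mesh.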

Posts related to graphics drivers development

Posts related to contributions to graphics drivers and APIs as part of my work for Igalia's graphics team.

Demos inspired by conferences (These were all written for fun)

Rust helloworld.
GUADEC 2019 inspired helloworld in Rust

My first program in Rust. I wrote it while attending a Rust/Gtk workshop at GUADEC 2019, where I went with Igalia (sponsor). I used OpenGL and SDL2 for the graphics, though. :)

Blog post:
GUADEC 2019 took place in my city! Also: my helloworld in Rust.
XDC 2018 intro.
A short intro inspired by the XDC 2018 logo

XDC 2018 was organized and sponsored by Igalia and took place in A Coruña. This is a short animated intro inspired by the lighthouse of A Coruña that appeared in the conference logo.

Blog post: XDC2018
Mohawk hair simulation.
Hair simulation using a mass-spring system

That was a mohawk hair simulation demo, inspired by some really cool animations I saw at SIGGRAPH 2018, which I attended with Igalia.

Blog post:
Hair simulation with a mass-spring system (punk's not dead!)
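The core of a mass-spring strand is just Hooke's law plus damping and gravity, integrated over time. A minimal 1D Python sketch of a single node hanging from an anchor (illustrative names and constants, not the demo's code, which simulates whole strands in 3D):

```python
def spring_step(pos, vel, rest, anchor, k, damping, gravity, mass, dt):
    """One explicit-Euler step for a node attached to `anchor` by a
    spring of rest length `rest` (positions measured downward).
    Force = spring restoring force + velocity damping + gravity."""
    stretch = (pos - anchor) - rest
    force = -k * stretch - damping * vel + mass * gravity
    vel = vel + (force / mass) * dt   # integrate acceleration
    pos = pos + vel * dt              # integrate velocity
    return pos, vel
```

With damping, repeated steps settle at the static equilibrium `rest + mass * gravity / k`, i.e. the spring stretched just enough to carry the node's weight; chaining many such nodes along a strand gives the hair motion.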

Other posts about hacks, conferences, presentations (not code)

XDC 2020.
Blog posts on conferences (attendance or presentation)

Browsers

WebKit team.
Blog posts inspired by issues I've faced while working on WebKit related tasks:
  • Adoption of ANGLE in WebKit on Linux (slides)
    This talk was about the experiments I performed to find an optimal way to share texture data between the main WebKit rendering process (which uses the native Linux OpenGL driver) and WebGL (which uses ANGLE), without copying the data from one render target to the other. I first tried using a shared EGL/OpenGL context between the main rendering process and the EGL backend, and then Linux dma-bufs. The second method had some advantages: it could work across multiple processes, and it didn't require changes in the ANGLE backend, only some EGL extensions to be supported by the drivers.