As I’ve worked on my Pushrod library over the past year, one of the major struggles I encountered was trying to figure out how to draw using just the GPU, instead of drawing directly to the on-screen Canvas.

I originally started this project using Piston (hence the engine analogies), but found that library overly complicated for what I wanted. Although it provided all of the features I needed, it introduced a concerning amount of overhead. Not to mention, Piston is a more game-oriented library.

For a couple of months, I researched different libraries for drawing to the screen. Qt/KDE and Gnome libraries were thrown out immediately because they are bindings to C++ libraries that already provide a working Widget library. Other libraries used WebAssembly, and some rendered to a web browser, but that was not the goal.

I ultimately settled on SDL2.

Why? SDL2 is widely supported on Windows, macOS, Linux, iOS, and Android, and has been ported to game hardware platforms such as Xbox and PlayStation. SDL2 is lightweight, uses very little CPU, and provides primitive drawing calls, making it easy to draw to the screen via a GPU Texture.

Here is a small article c/o StackOverflow explaining why Textures are a good idea.
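To make the pattern concrete, here is a minimal sketch (not Pushrod code): render the expensive content into a GPU Texture once, then copy that texture to the canvas each frame. It assumes a mutable canvas: Canvas<Window> created elsewhere, and the 400x300 size is illustrative.

use sdl2::pixels::{Color, PixelFormatEnum};

// Sketch only: assumes `canvas: Canvas<Window>` already exists.
let texture_creator = canvas.texture_creator();
let mut texture = texture_creator
    .create_texture_target(PixelFormatEnum::RGBA8888, 400, 300)
    .unwrap();

// Draw the expensive contents once, into the texture (it stays in GPU memory).
canvas
    .with_texture_canvas(&mut texture, |texture_canvas| {
        texture_canvas.set_draw_color(Color::RGB(255, 255, 255));
        texture_canvas.clear();
    })
    .unwrap();

// Every frame afterwards, copying the cached texture to the screen is cheap.
canvas.copy(&texture, None, None).unwrap();
canvas.present();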

The challenge? Getting SDL2 Textures to behave properly when it came to Rust programming.

Pushrod uses a TextureCache that allows image Textures to be cached in memory and shared among Widgets. The Widgets must belong to the same WidgetCache; otherwise, the TextureCache has to be refreshed on every use.

A single instance of the TextureCache currently exists per Window created in Pushrod and SDL2. Therefore, the Texture objects can be used over and over to your heart’s content … within reason, of course.

The code:

use std::collections::HashMap;
use std::path::Path;

use sdl2::image::LoadTexture;
use sdl2::render::{Canvas, Texture};
use sdl2::video::Window;

/// This is the structure for the `TextureCache`.
pub struct TextureCache {
    images: HashMap<String, Texture>,
}

/// This is a `Texture` cache object that is used by the `WidgetCache`.  This is responsible for loading
/// in images into a cache in memory so that it can be copied multiple times as required by the
/// application.
impl TextureCache {
    /// Creates a new `TextureCache`.
    pub fn new() -> Self {
        Self {
            images: HashMap::new(),
        }
    }

    /// Loads an image based on the `image_name`, which is the filename for the image to load.
    /// Returns a reference to the `Texture` that was loaded.
    pub fn get_image(&mut self, c: &mut Canvas<Window>, image_name: String) -> &Texture {
        // `or_insert_with` only loads the texture on a cache miss; a plain
        // `or_insert` would evaluate its argument (and load the file) on every call.
        self.images.entry(image_name.clone()).or_insert_with(|| {
            c.texture_creator()
                .load_texture(Path::new(&image_name))
                .unwrap()
        })
    }
}
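A hypothetical usage sketch (the asset path, position, and size are illustrative, not actual Pushrod API): inside a Widget's draw routine, ask the cache for the texture and copy it into the widget's bounds.

use sdl2::rect::Rect;

// `texture_cache` and `canvas` are assumed to be handed in by the caller.
let texture = texture_cache.get_image(canvas, String::from("assets/button.png"));
canvas
    .copy(texture, None, Some(Rect::new(16, 16, 100, 50)))
    .unwrap();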

Note that I’m using a HashMap to store the Textures. Since each stored Texture lives as long as the TextureCache, there should be no conflict in how long the cache lives. After all, the TextureCache is owned by the WidgetCache, which in turn is owned by the Engine, so each object lives as long as its parent.

(Insert sound of head exploding here.)

See why lifetimes are so difficult? It’s more complicated than that, but that’s the tree you mentally have to traverse. For now, we can assume that the lifetime of each of these objects is the lifetime of its parent object, or effectively the 'static lifetime.

The downside to this approach was having to use the unsafe_textures feature of the Rust SDL2 crate. The main reason was the complexity of the lifetime parameters on Texture. Once I enabled the unsafe_textures feature, everything just fell into place.
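To illustrate the difference (a sketch, not actual Pushrod code): without the unsafe_textures Cargo feature, Texture<'r> borrows from the TextureCreator that built it, so the cache would need a lifetime parameter that bubbles up through everything that owns it.

// Without `unsafe_textures`, Texture<'r> borrows from its TextureCreator, so the
// cache would need a lifetime parameter that propagates through WidgetCache and
// the Engine (sketch only):
pub struct TextureCache<'a> {
    images: HashMap<String, Texture<'a>>,
}

// With the `unsafe_textures` feature enabled in Cargo.toml, Texture has no
// lifetime parameter, and the plain `HashMap<String, Texture>` shown earlier
// compiles as written.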

Where I will have problems at this point is caching fonts. Fonts are currently cached c/o SDL2; however, I want to maintain my own cache so I can control access. This means implementing a lifetime of some sort. I plan on doing this at a later date: the code works in the 0.4.x branch, and I’m concentrating on functionality there rather than stability.
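As a rough idea of why a lifetime is unavoidable there (a hypothetical sketch, not planned Pushrod API): sdl2::ttf::Font is tied to the Sdl2TtfContext that loaded it, so a font cache would have to carry that lifetime explicitly.

use std::collections::HashMap;
use sdl2::ttf::Font;

// Hypothetical sketch: unlike Texture under `unsafe_textures`, Font always carries
// a lifetime tied to the Sdl2TtfContext, so the cache must be parameterized.
pub struct FontCache<'ttf> {
    fonts: HashMap<String, Font<'ttf, 'static>>,
}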

SDL2 has provided the functionality and the features I needed to get the Texture implementation done, and with very little effort.

I’m very happy with the outcome: fast, ultra-responsive UI with little CPU usage.

But will people use it? Only time will tell.