Glyphs are rasterized once and stored in a texture atlas. When rendering a glyph, the fragment shader pulls from that texture. Once loaded, the glyph stays loaded for the duration of the program.
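Rasterize-once-into-an-atlas usually means some packing scheme decides where each new glyph's bitmap lands in the texture. As a rough illustration, here is a minimal shelf-packing sketch in Rust; the `Atlas` and `GlyphSlot` names are hypothetical, not taken from any particular renderer:

```rust
// Minimal shelf-packing sketch: glyphs fill a row ("shelf") left to
// right; when a glyph doesn't fit, a new shelf starts below the
// tallest glyph of the previous one.

#[derive(Debug, PartialEq)]
struct GlyphSlot {
    x: u32,
    y: u32,
    width: u32,
    height: u32,
}

struct Atlas {
    width: u32,
    height: u32,
    cursor_x: u32, // next free x on the current shelf
    shelf_y: u32,  // top of the current shelf
    shelf_h: u32,  // tallest glyph seen on the current shelf
}

impl Atlas {
    fn new(width: u32, height: u32) -> Self {
        Atlas { width, height, cursor_x: 0, shelf_y: 0, shelf_h: 0 }
    }

    /// Reserve space for one glyph bitmap; None means the atlas is full
    /// (a real renderer would then rasterize into a fresh texture page).
    fn insert(&mut self, w: u32, h: u32) -> Option<GlyphSlot> {
        if self.cursor_x + w > self.width {
            // Current shelf exhausted: open a new one below it.
            self.shelf_y += self.shelf_h;
            self.cursor_x = 0;
            self.shelf_h = 0;
        }
        if self.shelf_y + h > self.height {
            return None;
        }
        let slot = GlyphSlot { x: self.cursor_x, y: self.shelf_y, width: w, height: h };
        self.cursor_x += w;
        self.shelf_h = self.shelf_h.max(h);
        Some(slot)
    }
}
```

The returned slot gives the texture coordinates the fragment shader samples from; the glyph's pixels only need to be uploaded to that region once.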
Got it, just a heads-up that a texture atlas tends to hammer your GPU texture upload path if you want to support Unicode or non-Latin (especially glyph-heavy) character sets.
Not trying to be discouraging, just something to keep in mind if that's a direction you want to go. Pretty excited to see GPU + Rust based stuff making it out into the wild.
Sure, but you're either going to have to generate the whole font up-front (which can be many thousands of characters), or re-upload as you use/generate new characters, which can thrash unless you're very careful about the regions you lock (and you have a driver that behaves appropriately).
Most font renderers I know of do a tiered LRU cache of 3-4 texture "pages", which hurts your drawcall batching but tends to be a nice tradeoff in texture usage.
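The page-based LRU idea can be sketched roughly like this: glyphs land on whichever page has room, and when every page is full the least-recently-used page is recycled wholesale, forcing its glyphs to be re-rasterized on next use. All names here (`PageCache`, `lookup`) are illustrative, not from a real renderer:

```rust
use std::collections::HashMap;

// Sketch of a glyph cache spread over a few texture "pages" with
// whole-page LRU eviction. Slots are just ids here; a real cache
// would store atlas coordinates and re-upload the evicted texture.

struct PageCache {
    pages: Vec<HashMap<char, u32>>, // glyph -> slot id, one map per page
    capacity: usize,                // glyphs per page
    last_used: Vec<u64>,            // LRU timestamp per page
    clock: u64,
    next_slot: u32,
}

impl PageCache {
    fn new(page_count: usize, capacity: usize) -> Self {
        PageCache {
            pages: vec![HashMap::new(); page_count],
            capacity,
            last_used: vec![0; page_count],
            clock: 0,
            next_slot: 0,
        }
    }

    /// Returns (page index, slot id), caching the glyph on a miss.
    fn lookup(&mut self, glyph: char) -> (usize, u32) {
        self.clock += 1;
        // Hit: glyph already resident on some page.
        if let Some(i) = (0..self.pages.len()).find(|&i| self.pages[i].contains_key(&glyph)) {
            self.last_used[i] = self.clock;
            return (i, self.pages[i][&glyph]);
        }
        // Miss: use a page with room, else evict the LRU page wholesale.
        let i = (0..self.pages.len())
            .find(|&i| self.pages[i].len() < self.capacity)
            .unwrap_or_else(|| {
                let lru = (0..self.pages.len()).min_by_key(|&i| self.last_used[i]).unwrap();
                self.pages[lru].clear(); // evicted glyphs must be re-rasterized later
                lru
            });
        let slot = self.next_slot;
        self.next_slot += 1;
        self.pages[i].insert(glyph, slot);
        self.last_used[i] = self.clock;
        (i, slot)
    }
}
```

The drawcall-batching cost mentioned above comes from glyphs in one run of text living on different pages: each page is a separate texture, so a run spanning pages can't be drawn in a single batch.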
Okay, so I dug into this. There is a font cache on the GPU and another in CPU RAM. I believe it will run into the drawcall-batching issue you're concerned about... but terminals don't need >60 FPS in most cases.
You really have nothing better to render on your GPU and store in that video memory than your terminal? I also use a web browser, which feels like it could use a performance boost a lot more than my terminal (particularly as I don't believe that using textures this way is actually the most efficient way to render fonts with OpenGL).
There's Distance Fields[1] and Loop-Blinn[2] aside from standard textures that I'm aware of.
Valve uses the first; very few people use the second because of patent issues. Both are more computationally expensive in some cases, so there are always tradeoffs to be made depending on your hardware.
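For context on the distance-field approach: the texture stores distance to the glyph outline instead of coverage, and the fragment shader smoothsteps around the 0.5 isoline to get an anti-aliased edge at any scale. Here is that per-fragment computation sketched on the CPU in Rust (the function names and the `smoothing` parameter are illustrative):

```rust
// Classic GLSL smoothstep, reimplemented for a CPU-side sketch:
// clamp t to [0, 1], then apply the cubic 3t^2 - 2t^3.
fn smoothstep(edge0: f32, edge1: f32, x: f32) -> f32 {
    let t = ((x - edge0) / (edge1 - edge0)).clamp(0.0, 1.0);
    t * t * (3.0 - 2.0 * t)
}

/// `distance` is the sampled SDF value in [0, 1], with 0.5 on the
/// glyph outline; `smoothing` sets the width of the anti-aliased edge.
fn sdf_alpha(distance: f32, smoothing: f32) -> f32 {
    smoothstep(0.5 - smoothing, 0.5 + smoothing, distance)
}
```

Deep inside the glyph (`distance` near 1.0) this yields full alpha, well outside (near 0.0) it yields zero, and fragments near the outline blend smoothly, which is where the extra per-fragment cost relative to plain texture sampling comes from.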