Monthly Update On A Haskell Game Engine

Hey all, I’ve been working for the past month or two on a game engine :slight_smile: (called Ghengin)

This post is not a release announcement – I’m still quite far from a version 0.1.0.
However, I’ve come a long way and I’d like to share a few pictures of my progress. Here’s the latest screenshot:

I hope to, soon enough, write a more substantial explanation of the engine’s technical challenges and overall design decisions so far, and of the game-developer-facing side of the engine.

A few key points regarding the main libraries it currently depends on (besides core ones like vector, containers, time, …):

  • The renderer is written using the great Haskell bindings to the Vulkan API
  • The shaders are crucial in the overall design, and a lot of code depends on their definition (e.g. preparing render pipelines, allocating descriptor sets and textures, everything materials related …). The shaders are written using FIR, an amazing shader language embedded in Haskell!
  • The entity management, scene graph and render queue are handled through the apecs entity component system.
  • Vectors and matrices are from geomancy
  • GLFW-b for window management and user input (used as the window backend for vulkan)
  • The dear-imgui bindings for the GUI
  • JuicyPixels for loading textures

A cool note on FIR: the shaders’ “interfaces” are defined at the type level, and that type information is used to validate the game-developer-defined materials, i.e. if you define materials incompatible with your shaders, the program will fail at compile time.
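
This mechanism is neat enough to warrant a sketch. The following is not FIR’s actual API, just an illustration in vanilla type-level Haskell of the general idea: the shader carries its binding interface in its type, and pairing it with a mismatched material is a type error.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeOperators #-}

-- Illustrative only: not FIR's real API, just the general idea of
-- validating materials against a type-level shader interface.

-- Kinds of values a shader binding can expect.
data BindingKind = Tex2D | Vec3F

-- A material is a heterogeneous list indexed by its binding kinds.
data Material (bs :: [BindingKind]) where
  Done    :: Material '[]
  Texture :: FilePath -> Material bs -> Material ('Tex2D ': bs)
  Static3 :: (Float, Float, Float) -> Material bs -> Material ('Vec3F ': bs)

-- A "shader" carries its expected interface at the type level.
newtype Shader (bs :: [BindingKind]) = Shader String

-- Pairing a material with a shader only type-checks when the
-- binding lists agree, so a mismatch is a compile-time error.
validate :: Material bs -> Shader bs -> String
validate _ (Shader name) = name ++ ": bindings match"

planetShader :: Shader '[ 'Tex2D, 'Vec3F ]
planetShader = Shader "planet"

ok :: String
ok = validate (Texture "gradient.png" (Static3 (1, 0, 0) Done)) planetShader
-- Swapping the order of Texture/Static3 would fail to compile.
```

FIR does considerably more than this (it type-checks the shader programs themselves), but the material/pipeline validation follows the same shape.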

The game itself is based on Sebastian Lague’s series Procedural Planets.

The showcase:

The very first achievement was rendering a triangle:

Then I rendered a simple cube and was able to rotate it with a simple model transform matrix:

Later I got a perspective camera which could move around the world. I was generating spheres at this point and the colors show that I was getting closer to generating the normals right too.

I managed to integrate dear-imgui into the renderer at this point, and later on fixed a dreaded off by one error which kept making the GUI behave funny and crash. I was also experimenting with simple diffuse lighting here.

With the GUI in place, I started focusing on developing planets: generating a single sphere and displacing each point on it by a noise value, producing terrain and mountains.
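
The displacement step itself is tiny. Here is a sketch of the idea (`noise3` below is a dummy stand-in for a real noise function such as layered simplex; none of these names are from the engine):

```haskell
-- Sketch of the height-displacement step: push each unit-sphere
-- point outwards along its own direction by a noise-based height.
type V3 = (Float, Float, Float)

scale :: Float -> V3 -> V3
scale k (x, y, z) = (k * x, k * y, k * z)

-- Dummy, deterministic "noise" in [0, 1]; a real engine would use
-- layered simplex/Perlin noise here instead.
noise3 :: V3 -> Float
noise3 (x, y, z) = (sin (12.9898 * x + 78.233 * y + 37.719 * z) + 1) / 2

-- Displace a point on the unit sphere: base radius 1 plus a small
-- noise contribution gives terrain and mountains.
displace :: Float -> V3 -> V3
displace amplitude p = scale (1 + amplitude * noise3 p) p
```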

After the terrain generation I spent some long weeks on the internals of the renderer before achieving more visual results, with the exception of the following planet, whose colors are based on each point’s height relative to the minimum and maximum heights:

Those weeks were spent on internal technical challenges which I hope to describe in a subsequent post, along with the resulting design and implementation (hopefully sparing readers, to some extent, the arduous process of understanding and reaching that design and implementation).

This week, with the material system working great for a first iteration, I finally spent some more time on the procedural planets: I added specular highlights to the lighting model (using the Blinn-Phong model) and added a (gradient-based) texture to the planet that is sampled according to the height of each point on the planet. The result is a nicely lit planet with colors depending on the height (lower = blue for water, middle = green for grass, higher = brown for mountains).
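
For reference, the Blinn-Phong specular term uses the half-vector between the light and view directions. A plain-Haskell sketch of the math (the real version lives in the FIR shaders, not in this form):

```haskell
-- Minimal Blinn-Phong specular term, written as plain Haskell for
-- illustration only.
type V3 = (Float, Float, Float)

dot3 :: V3 -> V3 -> Float
dot3 (a, b, c) (x, y, z) = a * x + b * y + c * z

normalize3 :: V3 -> V3
normalize3 v@(x, y, z) =
  let len = sqrt (dot3 v v) in (x / len, y / len, z / len)

add3 :: V3 -> V3 -> V3
add3 (a, b, c) (x, y, z) = (a + x, b + y, c + z)

-- Specular intensity given the surface normal, direction to the
-- light, and direction to the viewer (all assumed normalised),
-- plus a shininess exponent.
blinnPhongSpec :: V3 -> V3 -> V3 -> Float -> Float
blinnPhongSpec n toLight toView shininess =
  let h = normalize3 (add3 toLight toView)  -- half-vector
  in max 0 (dot3 n h) ** shininess
```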

PS: I don’t expect it to be useful without a proper explanation, but the development is going on @ GitHub - alt-romes/ghengin: WIP: Haskell Game Engine on Vulkan. The next feature which is almost done is a gradient editor in the GUI.

Here’s a quick look at the Main module of the procedural planets game.

initG :: Ghengin World ()
initG = do

  -- Planet settings used to generate the planet and which are edited through the UI
  ps <- makeSettings @PlanetSettings
  (planetMesh,minmax) <- newPlanet ps

  -- Load the planet gradient texture
  tex         <- texture "assets/planet_gradient.png" sampler

  -- Create the render pipeline based on the shader definition
  planetPipeline <- makeRenderPipeline Shader.shaderPipeline

  -- Create a material which will be validated against the render pipeline at compile time
  m1 <- material (Texture2DBinding tex . StaticBinding (vec3 1 0 0) . StaticBinding minmax) planetPipeline

  -- Create a render packet with the mesh, material, and pipeline.
  -- All entities with a RenderPacket component are rendered according to it.
  let p1 = renderPacket planetMesh m1 planetPipeline

  -- Define our scene graph
  sceneGraph do

    -- A planet entity, with the planet render packet and a transform
    e1 <- newEntity ( p1, Transform (vec3 0 0 0) (vec3 1 1 1) (vec3 0 (pi/2) 0) )

    -- A camera
    newEntity ( Camera (Perspective (radians 65) 0.1 100) ViewTransform
              , Transform (vec3 0 0 0) (vec3 1 1 1) (vec3 0 0 0))

    -- The planet UI component based on the `ps` settings
    newEntityUI "Planet"  $ makeComponents ps (e1,tex)

  pure ()

updateG :: () -> DeltaTime -> Ghengin World Bool
updateG () dt = do

  -- Every frame we update the first person camera with the user inputs and the planet's rotation
  cmapM $ \(_ :: Camera, tr :: Transform) -> updateFirstPersonCameraTransform dt tr
  cmap $ \(_ :: RenderPacket, tr :: Transform) -> (tr{rotation = withVec3 tr.rotation (\x y z -> vec3 x (y+0.5*dt) z) } :: Transform)

  pure False

main :: IO ()
main = do
  -- Run the game with this init, update and end function
  ghengin w initG undefined updateG endG

If you are curious about the full source of the planets game, beware of dragons :slight_smile:. It is not ready as a learning resource whatsoever.


Awesome work! Excited to see your progress :grinning:


Short update:

The gradient color picker is done and it’s now possible to pick in-game the planet color based on height :slight_smile:


This looks awesome!! I’d really like to see more games written in Haskell. Not trying to discourage you, but have you checked out the keid project? It seems like both projects could benefit from each other, or maybe join forces…


I haven’t tried keid myself but I am aware of it and have chatted a bit with the author @wiz on some occasions (concretely when fixing the dear-imgui bug and understanding matrix major modes in geomancy).

I started writing an engine because I wanted to, despite knowing about keid. Although comparing them now is fun – they have surely diverged in design decisions.

In due time, in a nice README, I’ll write what I can about related work such as keid.


Keid has a goal that runs a bit contrary to what @romes wanted, namely being the “skip over all the internals and generic implementation details” framework.

Quite a few here really want to know just how the sausage is made and to start from level zero. Which, for the Vulkan API, is way further down than the level OpenGL starts from.

Being a framework, Keid is closer to Hickory, I think. But I have only taken a cursory glance at it so far.


Great! Another game engine I wasn’t aware of hahaha.


If one can name more Haskell game frameworks than Haskell games, what would that tell us? :wink:


There’s a bunch of games listed on Games – Haskell GameDev and a bunch more (older ones) on the Haskell wiki: Applications and libraries/Games - HaskellWiki.


Apparently we’ve missed a February update :sweat_smile:

I’m still busy incorporating all the stuff that came out of the game jams and improving upon the newly discovered warts. Almost flipped the version to 2.0, but despite the initial impulse decided to proceed with 1.8.
The new release leans even more on ResourceT, so most of the breakage is due to blowing off useless resource-management cruft.
Another thing I’m busy with right now is introducing the GHC.Generics and Higher-Kinded Data trick for pipelines with multiple instance buffers, to remove even more boilerplate. Not particularly important for the pipelines already shipped, but it should speed up introducing new custom pipelines.
The tile pipeline got some game-jam love and is now compatible with Tiled rotation and mirroring flags. And the new buffer-to-image helpers make dynamic tile maps a breeze.
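
For readers unfamiliar with the Higher-Kinded Data trick mentioned above, here is an illustrative sketch (not keid’s actual code, and all names here are made up): the attribute record is written once, and the functor parameter switches between one concrete instance and whole per-attribute columns, the way instance buffers store them.

```haskell
{-# LANGUAGE DeriveGeneric #-}

import Data.Functor.Identity (Identity (..))
import GHC.Generics (Generic)

-- One record describes a pipeline's instance attributes; the
-- functor parameter picks the interpretation.
data InstanceAttrs f = InstanceAttrs
  { position :: f (Float, Float, Float)
  , tintRGBA :: f (Float, Float, Float, Float)
  } deriving Generic

-- "Plain value" interpretation: one concrete instance.
type Instance = InstanceAttrs Identity

-- "Buffer" interpretation: a whole column per attribute, as an
-- instance buffer would store them; just lists for illustration.
newtype Column a = Column [a]
type InstanceBuffers = InstanceAttrs Column

-- Gather many instances into per-attribute columns. GHC.Generics
-- machinery can derive this kind of traversal for any such record,
-- which is where the boilerplate savings come from.
toBuffers :: [Instance] -> InstanceBuffers
toBuffers is = InstanceAttrs
  { position = Column (map (runIdentity . position) is)
  , tintRGBA = Column (map (runIdentity . tintRGBA) is)
  }
```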

Struck by inspiration, I finally got around to implementing the KTX2 texture format, which allows native Zstd compression and native Vulkan formats. Loading can be even faster, with less memory usage on both CPU and GPU. Basically PNG-sized textures without the sRGB pain.


The next major engine release is finally out and I like it so far.

I’ve been experimenting with apecs stores and found it is very difficult to improve upon the IntMap. Also, apecs itself is a thing of beauty.

SDL2 got a new release and is now able to render textured/colored polygons without diving into OpenGL.

And now I’m back to this year’s January jam game project about tiles, units, orders and stuff.
The current focus is the unreasonable effectiveness of SDFs while preparing for navigation mesh generation. Perhaps I should wrap it up somewhere and start emergency preparations for LD53, coming this weekend.
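
For the curious: an SDF (signed distance field) returns the signed distance from a point to a shape’s boundary (negative inside, positive outside), and unions/intersections are just min/max, which is what makes them so effective for map generation. A tiny sketch with made-up shapes, unrelated to the actual project code:

```haskell
-- Signed distance fields: negative inside, positive outside,
-- zero on the boundary.
type SDF = (Float, Float) -> Float

circle :: Float -> SDF
circle r (x, y) = sqrt (x * x + y * y) - r

-- Approximate axis-aligned box distance (exact sign, approximate
-- magnitude outside the corners), half-width and half-height.
box :: Float -> Float -> SDF
box hw hh (x, y) = max (abs x - hw) (abs y - hh)

-- Boolean composition is just min/max.
union, intersect :: SDF -> SDF -> SDF
union f g p = min (f p) (g p)
intersect f g p = max (f p) (g p)

-- Sample an SDF onto a tile grid: True marks "inside" tiles.
rasterize :: Int -> Int -> SDF -> [[Bool]]
rasterize w h sdf =
  [ [ sdf (fromIntegral x, fromIntegral y) <= 0 | x <- [0 .. w - 1] ]
  | y <- [0 .. h - 1] ]
```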

As always, the meta-focus is on procedural generation since I have no studio behind me and have to use all the force multipliers I can.

Screenshot from 2023-04-26 22-05-59


What started as a weekend jam ended as a month-long death march, nothing new here :sweat_smile:

What is new, though…

  • tilesetter-codec – A project file reader for Tilesetter, which is an epic tileset editor/generator/database. I really recommend it. Add your tiles, plop them together, and it will guide you in making all the necessary transitions (and even generate some starters for you). The codec package is only a reader (as usual), but its tests include a code-generator example, which I used to export a huge 4-terrain LUT (actually a case expression), together with weighted tile alternatives.
  • The tileset (its hot parts) is derived from a public-domain image dump of that miscarried Dune 2 clone. I love the style and it is varied enough to be extended and reused elsewhere.
  • schematosis – A (very, very WIP) library for working with sparse trees using a Free wrapper over a branching functor, sampling SDFs into them, and then contouring (only marching squares for now, but also bilinear and (broken) dual contouring). I tried to use the contours to generate navigation meshes, but due to time pressure ended up just A*-ing over the rendered tile map.
  • gl-block – A library to generate Storable instances generically, according to packed/std140/std430 layout rules.
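
To illustrate what such a library automates (this is not gl-block’s API, just the std140 placement rule done by hand): each field goes at the next offset satisfying its alignment, e.g. a vec3 is 16-byte aligned but only 12 bytes in size, so a following float fits snugly at offset 12.

```haskell
-- Hand-rolled illustration of std140 field placement. A library
-- like gl-block derives this generically; here it is explicit.

-- Round an offset up to the next multiple of an alignment.
alignTo :: Int -> Int -> Int
alignTo a off = ((off + a - 1) `div` a) * a

data Field = FloatF | Vec3F

-- (alignment, size) per std140: float is 4/4, vec3 is 16/12.
fieldLayout :: Field -> (Int, Int)
fieldLayout FloatF = (4, 4)
fieldLayout Vec3F  = (16, 12)

-- Compute each field's byte offset and the total padded size.
offsets :: [Field] -> ([Int], Int)
offsets = go 0
  where
    go off [] = ([], alignTo 16 off)  -- struct size rounds to 16
    go off (f : fs) =
      let (a, s)        = fieldLayout f
          here          = alignTo a off
          (rest, total) = go (here + s) fs
      in (here : rest, total)
```

So `[Vec3F, FloatF]` packs into 16 bytes, while the reversed order `[FloatF, Vec3F]` needs 32, which is exactly the kind of accounting you want a deriving mechanism to get right for you.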

And, finally, the ld52 demo/workbench slid into ld53 and now presents something like an RTS backbone, featuring:

  • Random map generation (from SDFs, yeah) and rendering.
  • Minimap preview.
  • Unit/group selection and ordering around. Only moving for now, but with a queue.
  • Unit spritesheets with directions (and a mini-tool to cut and place them).
  • Building mode with placement preview.
  • Async path-finding around rough terrain and placed buildings.
  • Async visibility shroud updates.
  • reactive-banana for event handling, apecs for units.
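
The path-finding mentioned above is plain A* over the tile map. A minimal illustrative version (not the jam code) over a 4-connected grid with a Manhattan heuristic:

```haskell
import qualified Data.Map.Strict as M
import qualified Data.Set as S

type Cell = (Int, Int)

-- A* over a w-by-h 4-connected grid; `blocked` marks impassable
-- tiles (rough terrain, placed buildings). Returns the path from
-- start to goal inclusive, if one exists.
astar :: Int -> Int -> S.Set Cell -> Cell -> Cell -> Maybe [Cell]
astar w h blocked start goal =
    go (M.singleton (heur start, start) [start]) S.empty
  where
    heur (x, y) = abs (x - fst goal) + abs (y - snd goal)
    open (x, y) =
      x >= 0 && x < w && y >= 0 && y < h && not (S.member (x, y) blocked)
    -- The frontier is a Map keyed by (f-score, cell), so
    -- minViewWithKey doubles as a priority queue pop.
    go frontier seen = case M.minViewWithKey frontier of
      Nothing -> Nothing
      Just (((f, c@(x, y)), path), rest)
        | c == goal       -> Just (reverse path)
        | S.member c seen -> go rest seen
        | otherwise       ->
            let g   = f - heur c  -- cost spent so far
                new = M.fromList
                  [ ((g + 1 + heur n, n), n : path)
                  | n <- [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
                  , open n, not (S.member n seen) ]
            in go (M.union new rest) (S.insert c seen)
```

The async part is then just running searches off the main loop and delivering finished paths back as components.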

It’s all very raw, but I really need a break. After all the work put into it, there’s so much more groundwork to do. I will finish the UI story and then proceed with cleaning/fixing/polishing/extending stuff.

(And, technically, I’m in time to make it the monthly update of May :stuck_out_tongue_winking_eye:)


Incredible work @wiz !! I’m absolutely in awe of it, congrats!
(As for me, I’ll definitely definitely be using gl-block – sounds like a life saver)

Hackdate 20230910, captain’s log :sweat_smile:

Thinking about doing something really, really small this time for LD54 compo.


Famous last words :laughing:


My progress on the game-dev front is on hold, but I’ll hopefully be finishing my thesis by the end of the month and will have time for it afterwards.

Going to have to sit this Ludum Dare out, unfortunately!