I’m planning on working next year on improving monad-bayes (https://github.com/tweag/monad-bayes; see also https://www.tweag.io/blog/2019-09-20-monad-bayes-1/), a Haskell library for probabilistic programming with customizable Bayesian inference. I thought I’d post here to see if anyone had ideas or opinions on improvements and/or extensions that they’re interested in seeing in this space. Broadly, my plan is to start by doing some refactoring, and also making the library a bit more user-friendly, with easy-to-run examples and documentation for all its functionality.
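For concreteness, here’s the sort of minimal, easy-to-run example I have in mind. This is only a sketch against the API as it stands today (`MonadInfer`, `normalPdf`, and `mh` from the modules used in the Tweag blog post); the model and data are made up for illustration, and names may shift as the refactoring proceeds.

```haskell
import Control.Monad (forM_)
import Control.Monad.Bayes.Class (MonadInfer, factor, normal, normalPdf)
import Control.Monad.Bayes.Sampler (sampleIO)
import Control.Monad.Bayes.Traced (mh)
import Control.Monad.Bayes.Weighted (prior)

-- Infer the mean of a Gaussian with known noise from a few observations.
model :: MonadInfer m => [Double] -> m Double
model obs = do
  mu <- normal 0 5                             -- prior over the mean
  forM_ obs $ \x -> factor (normalPdf mu 1 x)  -- score each observation
  pure mu

main :: IO ()
main = do
  -- 2000 Metropolis-Hastings steps; the chain comes back newest-first.
  samples <- sampleIO (prior (mh 2000 (model [4.2, 5.1, 3.8])))
  print (take 5 samples)
```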
I’d like to work on gradient-based inference methods, but my understanding is that the ad library is not hugely performant. Is that true, and if so, are there any workable alternatives, e.g. something built on Accelerate?
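To make the requirement concrete: the core operation any HMC- or Langevin-style method needs is the gradient of a log-density, which with ad looks like the sketch below (the Gaussian log-density here is just a stand-in target I picked for illustration):

```haskell
import Numeric.AD (grad)

-- An unnormalized log-density (standard Gaussian), kept polymorphic in
-- `Floating a` so that `ad` can instantiate it at its reverse-mode type.
logDensity :: Floating a => [a] -> a
logDensity = negate . sum . map (\x -> x * x / 2)

-- One gradient-ascent step on the log-density: the basic move inside
-- MAP estimation, Langevin proposals, or HMC's leapfrog integrator.
climb :: Double -> [Double] -> [Double]
climb eta xs = zipWith (\x g -> x + eta * g) xs (grad logDensity xs)
```

The performance worry is exactly here: each call to `grad` builds a fresh tape of boxed nodes, and inference loops call it thousands of times.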
I’ve been meaning to read it :). Brendan Fong is a top-notch applied category theorist. The abstract made intuitive sense and seemed to promise beautiful contents within.
Even if one is going to stick with the “bastardized” category theory in base, rather than the more general and true-to-the-math stuff in the categories package (https://hackage.haskell.org/package/categories), I think it is still generally worth one’s while to read about the “real” stuff. There are lots of optimizations and fancy techniques that don’t work in “Hask” but do work in a more general setting, and it is at least good to start thinking about them.
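For reference, this is essentially all that base gives you (reproduced from Control.Category, modulo kind polymorphism): objects are forced to be bare Haskell types, with no way to attach constraints to them, which is roughly the restriction the categories package lifts.

```haskell
import Prelude hiding (id, (.))

-- The whole of base's Control.Category interface: a two-method class
-- whose objects can only be unconstrained Haskell types.
class Category cat where
  id  :: cat a a
  (.) :: cat b c -> cat a b -> cat a c
```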
More to the point, I suggest those interested in monad-bayes read its author’s thesis: https://www.repository.cam.ac.uk/handle/1810/295167, which builds up all the necessary theory more or less from scratch. Fong, in comparison, takes you for a walk in the park of category theory and shows you the views, but does not provide constructive tools.