I’m planning to spend next year improving monad-bayes (GitHub - tweag/monad-bayes, https://www.tweag.io/blog/2019-09-20-monad-bayes-1/), a Haskell library for probabilistic programming with customizable Bayesian inference. I thought I’d post here to see if anyone has ideas or opinions on improvements and/or extensions they’d like to see in this space. Broadly, my plan is to start with some refactoring and to make the library more user-friendly, with easy-to-run examples and documentation for all of its functionality.
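To give a flavour of the kind of easy-to-run example I have in mind, here is a sketch of a coin-bias model written against the monad-bayes API as of its 2019 release (`MonadInfer`, `uniform`, `score`, `mh`, `prior`, `sampleIO`); the model itself and the sample count are illustrative:

```haskell
{-# LANGUAGE FlexibleContexts #-}

import Control.Monad.Bayes.Class    (MonadInfer, score, uniform)
import Control.Monad.Bayes.Sampler  (sampleIO)
import Control.Monad.Bayes.Traced   (mh)
import Control.Monad.Bayes.Weighted (prior)
import Numeric.Log                  (Log (Exp))

-- Infer the bias of a coin from a list of observed flips.
coin :: MonadInfer m => [Bool] -> m Double
coin obs = do
  p <- uniform 0 1                 -- uniform prior over the bias
  -- Score each observation by its likelihood (score expects Log Double).
  mapM_ (\b -> score (Exp (log (if b then p else 1 - p)))) obs
  return p

main :: IO ()
main = do
  -- Metropolis-Hastings over the traced model, run in IO.
  samples <- sampleIO . prior . mh 2000 $ coin [True, True, False, True]
  print (sum samples / fromIntegral (length samples))  -- posterior mean estimate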
I’d also like to work on gradient-based inference methods, but my understanding is that the ad library is not hugely performant. Is that true, and if so, are there any workable alternatives, e.g. something built on Accelerate?
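For context, the call I have in mind is ad’s reverse-mode `grad` from `Numeric.AD`, which a gradient-based sampler such as HMC would invoke in its inner loop; the density function here is purely illustrative:

```haskell
import Numeric.AD (grad)

-- An illustrative (negative) log-density over two parameters.
negLogDensity :: Floating a => [a] -> a
negLogDensity [x, y] = 0.5 * (x * x + y * y) + sin x
negLogDensity _      = error "negLogDensity expects two parameters"

main :: IO ()
main = print (grad negLogDensity [1.0, 2.0])
```

The performance question is exactly about how calls like this behave when evaluated many thousands of times per inference run.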