Is it worth switching from Stack to Cabal?

I guess you can still run hpack manually.

I put a Makefile in my projects and just type make build, etc. Then the make rule calls cabal or stack or cargo or npm or …

That way my muscle memory does the same thing, regardless of build system.

And when I use cabal, I have the rule run hpack before calling cabal.
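A minimal sketch of such a wrapper Makefile (the target names and the hpack step are illustrative, not from the original post; in a Stack or Cargo project the recipe bodies would call stack/cargo instead):

```make
.PHONY: build test

# Regenerate the .cabal file from package.yaml, then delegate to cabal.
build:
	hpack
	cabal build

test:
	hpack
	cabal test
```

With that in place, `make build` does the right thing regardless of which tool sits underneath.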

1 Like

FWIW, CMake in C++ also encourages, if not forces, you to list individual files manually… people often seem to ask for a file-glob feature.

Perhaps that’s the reason why sadistic people like me aren’t super bothered by how cabal works by default :).

I consider this one of the ways cabal is still harder to use.

Attempting to follow my own advice, I looked to see if there was an issue about this. There are a few things bouncing around I was able to find. Most importantly for this whole thread, however, is Feature parity with Stack · Issue #8605 · haskell/cabal · GitHub

4 Likes

Since Cabal (cabal-install) removed sandboxes, I switched to Stack, and with some global settings it keeps Haskell development sane, especially if you (have to) care about disk-space usage.

Sane GHC-devel setup:

  • GHCup for compiler & toolchain installation
use GHCup’s GHC implicitly via the system-ghc global option
  • prevent installation of other GHC versions with install-ghc set to false
it usually also helps to loosen the upper-bound checks for the compiler and dependencies with the allow-newer and compiler-check options
  • use stack-clean-old tool to remove snapshot artifacts to keep global storage really lean
  • optionally set resolver to nightly for global projects if using latest GHC version
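Concretely, the settings above live in Stack’s global config; a sketch (the exact values and the nightly date are examples, adjust to taste):

```yaml
# ~/.stack/config.yaml -- global, non-project-specific settings
system-ghc: true             # use the GHC that GHCup already put on PATH
install-ghc: false           # refuse to download additional GHC versions
allow-newer: true            # ignore upper bounds when resolving dependencies
compiler-check: newer-minor  # accept a newer minor GHC than the snapshot pins

# ~/.stack/global-project/stack.yaml -- used when running outside any project
# resolver: nightly-2024-06-01   # a recent nightly for the latest GHC
```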

If I really need an older compiler & libs, or to limit dependency versions, I do it explicitly per-project. This setup causes fewer headaches than the pollution-prone cabal-install. Now the experience feels on par even with Rust’s Cargo.

2 Likes

I wouldn’t recommend cabal-store-gc, as it falls flat in some cases, leaving the cabal store and package index in an invalid state. This happened to me reproducibly with xmonad and xmonad-contrib installs. Even the author himself discourages it for normal use as unreliable.

2 Likes

stack-clean-old does handle Stack’s root, among other things. And it seems to be reliable in the long run.

1 Like

Something that I don’t see mentioned but is my #1 reason for switching from Stack to Cabal is that HLS support for executables/test suites is incomparably better in Cabal compared to Stack - https://github.com/haskell/haskell-language-server/issues/366

4 Likes

There’s a better option:

4 Likes

That’s good to know. However, I use the Nix integration, so Stack is installing GHC through Nix anyway. Is there a way to get GHCup to use Nix as well?

Search results for ‘ghcup nix’ - Haskell Community

2 Likes

Both the HLS project and the Stack project are keen that HLS support Stack and that Stack can output the information that HLS needs to support Stack (so that HLS does not depend on ‘hacks’). There is an open issue on Stack’s repository in that regard. From my perspective (Stack’s), what I am missing to help - and need to chase - is a precise specification of the information HLS needs that Stack can provide and in what format. I think HLS needs what is ultimately passed to GHC, but Stack does not know that directly - as Stack builds using Cabal (the library), not directly with GHC.

2 Likes

You can invoke anything from within the stack install hook, including nix. The script has to print the location of the ghc binary to stdout.
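For reference, that hook lives at ~/.stack/hooks/ghc-install.sh. A rough sketch of a Nix-backed hook (the nixpkgs attribute naming is an assumption on my part, and I haven’t tested this exact script):

```sh
#!/bin/sh
# Stack sets $HOOK_GHC_VERSION to the GHC version it wants; whatever this
# script prints on stdout is used as the path to the ghc binary.
set -eu
attr="haskell.compiler.ghc$(printf '%s' "$HOOK_GHC_VERSION" | tr -d '.')"
out=$(nix-build --no-out-link '<nixpkgs>' -A "$attr")
printf '%s/bin/ghc\n' "$out"
```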

1 Like

Updo will give you both Cabal and Stack, if you are using projects (stack.yaml and cabal.project).

2 Likes

Regarding Nix integration, you probably mean this thread

If you, or anybody else, finds a good reason not to deprecate it (it seems almost all Nix+cabal users say it only confuses newcomers and leads them away from good solutions) then please let cabal developers know. I’m not a Nix user, so I can’t tell if it’s “identical” to the Nix+stack integration or not, but any comparison and cross-pollination would be very welcome, too.

1 Like

You may know this, but

  • stack-clean-old is useful for cleaning stack-installed tools and libs
  • ghcup tui is useful for cleaning ghcup-installed tools
  • ncdu is useful for exploring these and other disk hogs in more detail

In the limit, I don’t think there’s any difference in disk usage between stack and cabal - you’ll use disk according to the number of GHC versions you need for the projects you’re currently working on. But I guess stack users will more easily accumulate GHC versions if they’re not being careful.

1 Like

I already mentioned in that thread that I was using Stack’s Nix integration. Nobody seemed to react.
In a way I understand, since it is not Cabal related; on the other hand, it might stop me from moving back to Cabal. (Not that I can’t replicate how I am using stack+nix, it’s more that I don’t have the time or the energy: trying to do anything with Nix when you’ve forgotten how it works is exhausting.)

1 Like

I thought that Cabal shares some objects that Stack doesn’t.
As I understand it, Stack recompiles everything under each directory, including external packages (unless they are in the pre-built snapshot; I’m probably wrong there).
So if you check out the same project twice (like with git worktree) and change some code, Stack will recompile everything but Cabal won’t. Am I right?

I may have misunderstood your point about ‘like git worktree’ and Stack rebuilding, but for local/mutable packages of a project, Stack puts the build artefacts of Cabal (the library) in its .stack-work working directory in the project directory. Most people add that working directory to .gitignore. I don’t think Stack builds unnecessarily.

I didn’t explain well. By “like git worktree” I mean checking out multiple branches of the same project. Basically, I work on one project A (directory A) and I need to create a branch to work on a long feature (or I am in the middle of a feature and need to fix a bug on the main branch, etc.), so I check the project out again in directory A-my-branch (or, if I use git worktree, under A/my-branch). This means I now have two .stack-work directories (one in A and one in A/my-branch) which are nearly identical, yet take double the disk space.
Given that on my projects a .stack-work is usually about 2 or 3 GB (I’m not sure why, I think it’s the docs), having a few branches quickly becomes an issue (when you only have 7 GB left on your hard drive).
Maybe there is a way to share a .stack-work between two directories.
I understand that Cabal does this naturally by having a global store.
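I don’t know of a supported way to share it, but one untested workaround (purely an assumption on my part) is to keep a single shared .stack-work outside the checkouts and symlink it into each worktree, so Stack just sees a .stack-work directory in the project root:

```shell
#!/bin/sh
# Untested sketch: share one .stack-work between two checkouts of project A.
# Run from inside a worktree, e.g. A/my-branch. Branch divergence will still
# cause rebuild thrashing, but only one copy of the artefacts sits on disk.
set -eu
SHARED="$HOME/.cache/stack-work-A"   # shared location; the name is arbitrary
mkdir -p "$SHARED"
rm -rf .stack-work
ln -s "$SHARED" .stack-work
```

Whether Stack behaves well with a symlinked work directory is exactly the kind of thing I’d test on a throwaway branch first.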

2 Likes