I became curious: is it possible to use curl to download a website, including all linked pages, into a working (set of) offline .html files?
I did some Googling, and from what I can see the curl utility won't really do that; it isn't in its feature set. But there is a tool called 'wget' which can. Unfortunately it isn't included in macOS by default, so you'd either have to compile the tool yourself (probably not in the scope of your skills) or install it through 'Homebrew', a system that makes it easier to add command-line tools on a Mac. Look into it if you want to spend more time in the shell.
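To make that concrete, here is a sketch of what a wget mirror run looks like once you have it installed (e.g. via `brew install wget`). The flags are standard GNU wget options; `example.com` is just a placeholder URL, and the command is printed rather than executed here since a real run needs network access:

```shell
#!/bin/sh
# Placeholder URL -- substitute the site you actually want to mirror.
url="https://example.com/"

# --mirror            recursive download with timestamping
# --convert-links     rewrite links so the saved pages work offline
# --adjust-extension  save HTML pages with a .html extension
# --page-requisites   also fetch the CSS, images, etc. each page needs
# --no-parent         never ascend above the starting directory
cmd="wget --mirror --convert-links --adjust-extension --page-requisites --no-parent $url"
echo "$cmd"
```

The result is a directory named after the site's hostname containing browsable .html files you can open locally.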
For everyone on this thread: having a Unix command-line shell on macOS is a big bonus to old Unix-heads like myself. I've been using Linux systems from the command line for almost 30 years, and mostly the same skills apply to the macOS shell as well. I used NeXTStep systems back in the day, and they were even more obviously Unix workstations. macOS is a direct descendant of NeXTStep, but it tries very hard to hide the Unix underneath.
You will find a lot more information on the web about using the Unix shell on Linux, because Linux is currently the most popular of the Unix-like systems. Most of that information applies to macOS as well, except that macOS has a different filesystem layout. The shell concepts are the same (pipes, common Unix utilities, etc.), and commands like 'sed', 'ls', 'grep', and 'find' all work the same way.
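As a tiny illustration of that portability, a pipeline like the one below runs identically in the macOS and Linux shells (the sample text is made up for the demo):

```shell
#!/bin/sh
# Feed three lines through a pipe, keep the ones matching "alpha",
# then count them -- the same pipe-and-filter idea works everywhere.
printf 'alpha\nbeta\nalphabet\n' | grep alpha | wc -l
```

This prints 2, since grep matches both "alpha" and "alphabet". That pipe-and-filter pattern is the core shell concept that transfers between every Unix-like system.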