If you want to read or write files anywhere without fuss, or if you want editing and developing those scripts to be easier.
Not everything is a web-accessible service, and sometimes you do care about short startup time -- for example, if you have a file conversion script that you need to run on 1000's of files.
You can kinda-sorta fake it with "docker exec", but it is really awkward, and at some point pyenv etc. is just easier.
For 1, this runs a new docker container for each file. That's about 0.3 seconds per container on my PC, which adds up to 5 minutes for 1000 files. And mapping only the current directory is not enough: you may want to run “convert input.foo /srv/webroot/output.jpg”.
So you need a better script: one which keeps the container running and uses “docker exec”, rewrites command lines to absolute paths, relaunches the container when it needs to map more directories, and eventually shuts it down. Once you write it, you’ll likely end up with something far more complicated than just setting up pyenv.
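Just the “keep one container alive and reuse it” part looks roughly like this (the image name, container name, and the “convert” command are all placeholders, not anything specific):

    # rough sketch: keep one container alive and reuse it via "docker exec"
    # "my-convert-image", "convert-helper" and "convert" are placeholders
    docker run -d --name convert-helper \
        -v "$PWD":/work -v /srv/webroot:/srv/webroot \
        my-convert-image sleep infinity

    for f in *.foo; do
        docker exec -w /work convert-helper \
            convert "$f" "/srv/webroot/${f%.foo}.jpg"
    done

    docker stop convert-helper && docker rm convert-helper

And that still does not handle relaunching with extra mounts, rewriting arbitrary paths, or shutting the container down automatically.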
2. Yes, I know. See above. That command will be super annoying, because things like “aws s3 cp” won't work inside it; the container won't even see your credentials. So you need to make the command longer and more complex, and then it ends up harder than just a venv install.
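For a sense of scale, here is roughly what that “one command” grows into once “aws s3 cp” has to work (the image and bucket names are made up, and every directory you touch still needs its own -v):

    docker run --rm -it \
        -v "$PWD":/work -w /work \
        -v "$HOME/.aws":/root/.aws:ro \
        -e AWS_PROFILE \
        my-python-image \
        aws s3 cp /work/output.jpg s3://my-bucket/output.jpg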
3. Neat! It does look workable if your dependencies are very complex. In most cases, however, I’d say venv is still nicer: you can use pydoc3 and graphics, you don’t have to worry about mapping input/output files, you can use related command-line tools, and so on.
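For comparison, the venv route for the same kind of script is roughly this (the package names and the script itself are just examples):

    python3 -m venv ~/.venvs/convert
    source ~/.venvs/convert/bin/activate
    pip install pillow awscli
    ./convert.py input.foo /srv/webroot/output.jpg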