> And impressively, PyPy [PyPy3 v7.3.1] only takes 0.31s to run the original version of the benchmark. So not only do they get rid of most of the overheads, but they are significantly faster at unicode conversion as well.
Wow, that's pretty impressive. I never really got to use PyPy though, as it seems that for most programs either performance doesn't really matter (within a couple of orders of magnitude), or numpy/pandas is used, in which case the optimization in calling C outweighs any others.
If you are concerned about performance, it really shines. It speeds up computational workloads, for sure, but it also improves performance in lots of I/O scenarios.
Anything where function call overhead is an issue and you don't have a native library escape hatch. Parsers written in Python are inherently slow. Especially so with parser combinators.
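To illustrate the point about parser combinators: every input character flows through several layers of small Python closures, and on CPython each of those calls pays full interpreter overhead, while PyPy's tracing JIT can inline them. Here's a minimal, hypothetical combinator sketch (the names `char`, `many`, `seq` are made up for illustration, not from any particular library):

```python
# Minimal parser-combinator sketch. Each combinator returns a closure,
# so parsing one character involves several nested Python calls --
# exactly the overhead PyPy's JIT removes by inlining hot traces.

def char(c):
    """Parser matching a single literal character."""
    def parse(s, i):
        if i < len(s) and s[i] == c:
            return c, i + 1
        return None
    return parse

def many(p):
    """Parser applying p zero or more times, collecting results."""
    def parse(s, i):
        out = []
        while True:
            r = p(s, i)
            if r is None:
                return out, i
            v, i = r
            out.append(v)
    return parse

def seq(*ps):
    """Parser applying each sub-parser in order; fails if any fails."""
    def parse(s, i):
        out = []
        for p in ps:
            r = p(s, i)
            if r is None:
                return None
            v, i = r
            out.append(v)
        return out, i
    return parse

# Parsing "aaab": one closure call per character per combinator layer.
parser = seq(many(char("a")), char("b"))
print(parser("aaab", 0))  # ([['a', 'a', 'a'], 'b'], 4)
```

On CPython each of those closure calls allocates a frame and goes through the full call machinery; PyPy traces the hot loop in `many` and compiles it down to something close to a hand-written scanner.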
Well, anything that you need to do where the libraries are there and waiting for you.
If you get into the territory of missing libraries it can be a bit of a pain. Otherwise, it's a breath of fresh air, as it's almost a drop-in replacement.
Just to clarify, does it matter if these libraries are pure Python? What I have a hard time understanding is how PyPy is almost a drop-in replacement but then has issues with missing libraries. Couldn't you just pip install the libraries, or just literally get the source and run them if they're pure Python?
While unlikely to be a common use case, it's super handy to deploy onto Flatcar Linux, because pypy is just "unpack the tgz and run," without dealing with a lot of crazy shared library dependencies.
Can anyone share use cases for PyPy?