On the contrary, that window is huge, and you're also assuming exceptions require expensive allocations, which they don't.
Exceptions are for exceptional situations, so the performance impact of them when thrown is rarely of concern - both in C++ and in pretty much every other language with exceptions. There's a reason something like Java still uses error return values for an awful lot of things, after all. But importantly, exceptions mean that you're not doing return value checking for errors that rarely happen, which can get expensive on the whole. As in, exceptions give your code a path to being both cleaner and faster on average - they are really very useful when implemented well.
The problem with C++ exceptions has nothing to do with the memory allocation cost when thrown (which, as mentioned, can be addressed with strategies other than the default of plain `new`). Rather, it comes from the significant impact on binary size and the reliance on RTTI.
The bloat from the exception tables isn't really ideal, but it is the tradeoff of allowing for exceptions to be zero cost in the non-exceptional cases.
I'm not sure RTTI for exceptions is such a big deal.
I'm guessing you are instead referring to the fact that exception catching needs to pull in most of the dynamic casting machinery in many cases. Exception hierarchies, at least in theory, can involve multiple inheritance, virtual inheritance, and other complications. That means determining at runtime whether a thrown exception is compatible with a given catch clause is non-trivial, and even once determined, the cast itself may be non-trivial - not to mention that catching by value may need to invoke a copy constructor.
If a language catches by type, some form of run-time type information in the exception object is fundamentally needed if the language allows throwing an exception whose concrete type is not known at initial throw time. C++ is such a language, as you can throw an exception that was allocated elsewhere, and which you received via a pointer to a non-final class.
In a language where the concrete type is always known at initial exception throw time, then the stack unwinding code could conceptually simply identify any catch blocks that would apply from data in the exception tables because all superclasses are statically known. And it could pregenerate code for any possible upcasting or copy construction needed, so no dynamic casting machinery would be needed.
(But many other static languages with exceptions only allow single inheritance, don't allow catching by interface, and only allow catching by reference, so no fancy copying or upcasting code is needed. Almost all of them do allow throwing without knowing the concrete type, so they need some form of RTTI data nevertheless.)
The RTTI requirement is more of an issue since that's forcing RTTI on for all types, not just those that can be thrown as exceptions. An explicit "throws" syntax would eliminate that (since the thrown type is now statically known and doesn't require RTTI) and thus significantly cut down on the cost.
Alternatively, and this isn't very "C++" but would work, it could be required that all thrown exceptions must inherit from a base "Exception" type such that you then only need to require RTTI for that chain of types instead of all types.