This concept is rarely used in the codebase and is highly error-prone
if you forget to check it. Instead, make operations that would produce
invalid integers return an error.
Instead of letting every [[Call]] implementation allocate an
ExecutionContext, we now make that the responsibility of the caller.
The main point of this exercise is to allow the Call instruction
to write function arguments directly into the callee's ExecutionContext
instead of copying them there later.
This makes function calls significantly faster:
- 10-20% faster on micro-benchmarks (depending on argument count)
- 4% speedup on Kraken
- 2% speedup on Octane
- 5% speedup on JetStream
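A rough sketch of the new call shape, using toy stand-in types (Value,
ExecutionContext, FunctionObject::call and call_function here are
hypothetical, not the engine's real API):
```cpp
#include <cstddef>
#include <vector>

// Toy stand-ins for the engine types; everything here is hypothetical.
struct Value { double number { 0 }; };

struct ExecutionContext {
    explicit ExecutionContext(std::size_t argument_count)
        : arguments(argument_count)
    {
    }
    std::vector<Value> arguments;
};

struct FunctionObject {
    // [[Call]] now receives a fully prepared context from the caller.
    Value call(ExecutionContext& context)
    {
        Value result;
        for (auto const& argument : context.arguments)
            result.number += argument.number; // Toy body: sum the arguments.
        return result;
    }
};

Value call_function(FunctionObject& callee, std::vector<Value> const& arguments)
{
    // The caller allocates the context and writes arguments directly into
    // their final slots, so [[Call]] never has to copy them again.
    ExecutionContext context(arguments.size());
    for (std::size_t i = 0; i < arguments.size(); ++i)
        context.arguments[i] = arguments[i];
    return callee.call(context);
}
```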
Object defines an is_error() virtual method that Error overrides for
fast type checks. This has the same name as the Error.isError
constructor method. Rename the former to avoid the conflict, as GCC 15
has just started warning about it.
This allows us to get rid of instructions that move arguments to locals
and to allocate a smaller JS::Value vector in ExecutionContext by
reusing the slots that were already allocated for arguments.
With this change, for the following function:
```js
function f(x, y) {
    return x + y;
}
```
we now produce the following bytecode:
```
[ 0] 0: Add dst:reg6, lhs:arg0, rhs:arg1
[ 10] Return value:reg6
```
instead of:
```
[ 0] 0: GetArgument 0, dst:x~1
[ 10] GetArgument 1, dst:y~0
[ 20] Add dst:reg6, lhs:x~1, rhs:y~0
[ 30] Return value:reg6
```
Previously, we blocked using locals for function arguments whenever
`arguments` was mentioned in the function body. However, this is not
necessary in strict mode, where mutations to the arguments object are
not reflected in the function arguments and vice versa (for example,
assigning to `arguments[0]` inside a strict-mode function does not
change its first parameter).
It turns out it was a mistake to make this virtual, since
ServiceWorkerAgents are effectively the same as DedicatedWorkerAgents
and SharedWorkerAgents, just with [[CanBlock]] set to false.
This helps unwind a niggly dependency: the VM owns and constructs the
Heap and the Agent, but the Agent wants customized construction that
depends on the Heap. Solve this by deferring the initialization of the
Agent until after we have constructed the VM and the Heap.
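A minimal sketch of the resulting construction order, using hypothetical
stand-in types rather than the real VM, Heap, and Agent classes:
```cpp
#include <memory>

// Hypothetical stand-ins for the real classes.
struct Heap { };

struct Agent {
    explicit Agent(Heap& heap)
        : heap(heap)
    {
    }
    Heap& heap; // The agent's setup depends on the heap.
};

struct VM {
    Heap heap;
    std::unique_ptr<Agent> agent; // Filled in after the VM exists.
};

int main()
{
    auto vm = std::make_unique<VM>();              // 1. Construct the VM (and its heap) first.
    vm->agent = std::make_unique<Agent>(vm->heap); // 2. Initialize the agent afterwards.
}
```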
By doing that, we avoid a separate allocation for each such vector,
which was really expensive on JS-heavy websites. For example, this
change helps bring EC allocation down from ~17% to ~2% on Google Maps.
This comes at the cost of extra complexity in the custom execution
context allocator, because an EC no longer has a fixed size and we need
to maintain a list of buckets.
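A minimal sketch of the bucketed-allocator idea; the class name, bucket
sizes, and malloc-based backing store below are all hypothetical:
```cpp
#include <array>
#include <cstddef>
#include <cstdlib>
#include <optional>
#include <vector>

// Hypothetical bucketed allocator: requests are rounded up to a fixed set of
// bucket sizes, and freed memory is recycled through per-bucket free lists.
class ExecutionContextAllocator {
public:
    void* allocate(std::size_t size)
    {
        auto bucket = bucket_for(size);
        if (!bucket.has_value())
            return std::malloc(size); // Oversized request: bypass the buckets.
        auto& free_list = m_free_lists[*bucket];
        if (!free_list.empty()) {
            void* memory = free_list.back();
            free_list.pop_back();
            return memory; // Reuse a previously freed block from the same bucket.
        }
        return std::malloc(s_bucket_sizes[*bucket]);
    }

    void deallocate(void* memory, std::size_t size)
    {
        auto bucket = bucket_for(size);
        if (!bucket.has_value()) {
            std::free(memory);
            return;
        }
        m_free_lists[*bucket].push_back(memory); // Keep it for the next allocation.
    }

private:
    static constexpr std::array<std::size_t, 4> s_bucket_sizes { 128, 256, 512, 1024 };

    static std::optional<std::size_t> bucket_for(std::size_t size)
    {
        for (std::size_t i = 0; i < s_bucket_sizes.size(); ++i) {
            if (size <= s_bucket_sizes[i])
                return i;
        }
        return std::nullopt;
    }

    std::array<std::vector<void*>, s_bucket_sizes.size()> m_free_lists;
};
```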
This is better because:
- Better data locality
- The vector for registers+constants+locals+arguments is allocated in
  one go instead of as two separate vectors
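A minimal sketch of the single-buffer frame layout, with toy types
(Value, Frame, and the offset scheme are hypothetical):
```cpp
#include <cstddef>
#include <vector>

// Toy types; the real frame layout differs, but the idea is the same.
struct Value { double number { 0 }; };

class Frame {
public:
    Frame(std::size_t register_count, std::size_t constant_count,
          std::size_t local_count, std::size_t argument_count)
        : m_constants_offset(register_count)
        , m_locals_offset(register_count + constant_count)
        , m_arguments_offset(register_count + constant_count + local_count)
        , m_slots(register_count + constant_count + local_count + argument_count)
    {
    }

    // All four regions live in one contiguous buffer: a single allocation
    // per frame, and neighboring values share cache lines.
    Value& reg(std::size_t i) { return m_slots[i]; }
    Value& constant(std::size_t i) { return m_slots[m_constants_offset + i]; }
    Value& local(std::size_t i) { return m_slots[m_locals_offset + i]; }
    Value& argument(std::size_t i) { return m_slots[m_arguments_offset + i]; }

private:
    std::size_t m_constants_offset { 0 };
    std::size_t m_locals_offset { 0 };
    std::size_t m_arguments_offset { 0 };
    std::vector<Value> m_slots;
};
```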
For the slight cost of counting code points when converting between
encodings, and a teeny bit of memory, this commit adds a fast path for
all-happy UTF-16 substrings and code point operations. These
operations account for a significant chunk of the time spent in many
regex benchmarks.
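A minimal sketch of the fast-path idea, using a hypothetical Utf16String
class that records at construction whether any surrogate pairs are
present; when none are, a code point lookup is a direct code unit index:
```cpp
#include <cstddef>
#include <string>

// Hypothetical class illustrating the fast path.
class Utf16String {
public:
    explicit Utf16String(std::u16string string)
        : m_string(std::move(string))
    {
        // One scan up front: remember whether any surrogate pairs exist.
        for (char16_t unit : m_string) {
            if (unit >= 0xD800 && unit <= 0xDFFF) {
                m_has_surrogates = true;
                break;
            }
        }
    }

    char32_t code_point_at(std::size_t index) const
    {
        if (!m_has_surrogates)
            return m_string[index]; // Fast path: code unit index == code point index.

        // Slow path: walk the string, combining surrogate pairs.
        std::size_t code_point_index = 0;
        for (std::size_t i = 0; i < m_string.size(); ++i) {
            char32_t code_point = m_string[i];
            if (code_point >= 0xD800 && code_point <= 0xDBFF && i + 1 < m_string.size()) {
                char16_t low = m_string[i + 1];
                if (low >= 0xDC00 && low <= 0xDFFF) {
                    code_point = 0x10000 + ((code_point - 0xD800) << 10) + (low - 0xDC00);
                    ++i;
                }
            }
            if (code_point_index++ == index)
                return code_point;
        }
        return 0; // Out of range; a real implementation would assert.
    }

private:
    std::u16string m_string;
    bool m_has_surrogates { false };
};
```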
By doing that, we consistently use the Identifier node for identifiers
and also enable the mechanism that registers identifiers with the
corresponding ScopePusher for catch parameters, which is necessary for
the upcoming changes.
Similar-origin window agents have the [[CanBlock]] flag set to false.
Achieve this by hooking up JS's concept of an agent to HTML::Agent.
For now, this is only hooked up to the similar-origin window agent
case, but it should be extended to the other agent types in the future.
Instead of using the more generic define_native_accessor() here,
we poke directly at the indexed property storage for the parameter map.
We can also construct the NativeFunction objects directly, without
giving them names like "get 0" etc., since these are not observable
from userspace.
Finally, by using default property attributes (not observable anyway),
we get simple indexed storage instead of generic (hash map) storage.
Instead, let JS::NativeFunction store the AK::Function directly, and
take care of conservatively marking its captured data.
This avoids an extra GC allocation for every JS::NativeFunction.
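The conservative-marking part can be pictured roughly as below; the Heap
type and helper are toy stand-ins (the real code integrates with the
garbage collector and AK::Function's storage), scanning the captured
bytes for anything that matches a live cell address:
```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <unordered_set>

// Toy heap; the real code hooks into the garbage collector's marking phase.
struct Heap {
    std::unordered_set<std::uintptr_t> live_cell_addresses;
    std::unordered_set<std::uintptr_t> marked;

    // Scan an opaque byte buffer word by word; any word that happens to equal
    // the address of a live cell is conservatively treated as a reference.
    void mark_conservatively(void const* data, std::size_t size)
    {
        auto const* bytes = static_cast<std::uint8_t const*>(data);
        for (std::size_t offset = 0; offset + sizeof(std::uintptr_t) <= size; offset += sizeof(std::uintptr_t)) {
            std::uintptr_t word;
            std::memcpy(&word, bytes + offset, sizeof(word));
            if (live_cell_addresses.count(word))
                marked.insert(word); // Might be a pointer: keep that cell alive.
        }
    }
};
```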
These are slightly unfortunate as we're crossing the library boundary,
but there's precedent with Object::is_dom_node(), and these are just
knocking down a few more items that were showing up in profiles.
This string is very frequently returned by Object.prototype.toString(),
so we benefit from caching it instead of recreating it every time.
Takes Speedometer2.1/EmberJS-Debug-TodoMVC from ~4000ms to ~3700ms
on my M3 MacBook Pro.
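A minimal sketch of the caching idea; the assumption that the cached
value is the common "[object Object]" result, plus the
PrimitiveString/Intrinsics types below, are illustrative only:
```cpp
#include <memory>
#include <string>

// Toy types; the cached string and the accessor name are illustrative only.
struct PrimitiveString {
    std::string value;
};

struct Intrinsics {
    // Create the frequently returned string once, then hand out the cached object.
    std::shared_ptr<PrimitiveString> const& object_to_string_result()
    {
        if (!m_cached_result)
            m_cached_result = std::make_shared<PrimitiveString>(PrimitiveString { "[object Object]" });
        return m_cached_result;
    }

private:
    std::shared_ptr<PrimitiveString> m_cached_result;
};
```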
This removes another Match member that required destruction. The "API"
for accessing the strings is definitely a bit awkward. We'll think of
something nicer eventually.
Our engine already keeps track of the home realm for all objects.
This is stored in Shape::realm(). We can use that instead of having
a dedicated member in ESFO for the same pointer.
Since there's always a home realm these days, we can also remove some
outdated fallback code from the days when having a realm was not
guaranteed due to LibWeb shenanigans.
If we have two PrimitiveString objects that are both backed by UTF-16
data, we don't have to convert them to UTF-8 for equality checking.
Just compare the underlying UTF-16 data. :^)
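A minimal sketch of the equality fast path, with toy types standing in
for PrimitiveString:
```cpp
#include <optional>
#include <string>

// Toy type: a string may cache either or both encodings.
struct PrimitiveString {
    std::optional<std::string> utf8;
    std::optional<std::u16string> utf16;
};

bool equals(PrimitiveString const& a, PrimitiveString const& b)
{
    // Fast path: both sides already hold UTF-16 data, so compare the code
    // units directly instead of transcoding either side to UTF-8.
    if (a.utf16.has_value() && b.utf16.has_value())
        return *a.utf16 == *b.utf16;
    // Otherwise compare UTF-8 (transcoding of missing forms elided here).
    return a.utf8.value_or("") == b.utf8.value_or("");
}
```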
This allows us to remove the BoundFunction::m_name field, which we
were initializing with a formatted FlyString on every function binding,
despite never using it for anything.