As an open-source builder and a streamer, I'm afraid I'll leak keys on stream sooner or later. And fun story—I did leak the API keys to my smart lights once, and the company (Govee) had a 30-day grace period before revoked keys actually stopped working!
It still looks too tedious to manage all this—curious to see if there's an easier way. Currently I use 1Password with my teams to share .env config, but we basically copy-paste it into local git folders, so there's still a lot to lose.
I'm especially worried about the growing number of supply chain attacks. Curious to see how you tackle these.
Gea in fact supports regular HTML strings out of the box—that's what the compiler turns the JSX into anyway. However, IDE tooling for syntax-highlighting those plain HTML strings is still in the works.
What syntax would you prefer from Svelte? Like for hooks / stores, or rendering?
Heh, sorry, I (the author) wasn't the one who created the post. But the idea is reactivity in JS shouldn't require new syntax.
Gea works best with the compiler. I've documented non-compiled browser usage (only the JSX gets compiled) here: https://geajs.com/docs/browser-usage.html, but that requires manual store observers and manual DOM updates, which means you're not _really_ benefiting from Gea.
Thanks for your insights. I was originally hesitant about the performance of proxies, too, but they turned out to be great. The benchmarks (https://geajs.com/benchmark-report.html) also show good results, both in terms of memory and CPU cycles. Proxies obviously add overhead, but it's not night and day (https://jsben.ch/proxy-vs-object-performance-benchmark-dtxo6 is a good test for this): with a proxy, you can set a property 25 million times per second (on my M4 Max machine in Safari) with only a 4% perf loss vs. defineProperty, and Chrome runs at about half that throughput with a 20% loss vs. defineProperty. So 12.5 million sets per second is still pretty good. Of course, if your use case demands more performance, nothing beats hand-optimized vanilla JS code.
Since Gea doesn't rerender the template _at all (well, for the most part, at least)_, in theory we wouldn't really gain much from getter memoization, mainly because we create proxy observers for computed values that update the DOM in place only when the underlying value updates.
And since stores are plain old JS classes, there's no need for an "async store" concept. Just update a value in the store whenever you want, track its loading state however you want, either synchronously or asynchronously, and the observers will take care of the rest. If you're referring to another pattern that I'm not aware of, please let me know.
However, in my simple self-written benchmark that compares the time it takes to sum the property values (i.e., getter access) of 100 million proxies vs. plain objects, the proxies come out 13x slower.

When benchmarking setting a property value on those 100 million proxies vs. plain objects, the proxies come out 35x slower.
My simple benchmark gives results that significantly deviate from the linked benchmark. Regardless, the performance implications of proxies should be evaluated on a case-by-case basis.
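For anyone who wants to reproduce this kind of comparison, here's a minimal sketch of a proxy-vs-plain-object micro-benchmark (scaled down to a million iterations; the identity traps below are the cheapest possible proxy handlers, and real numbers will vary a lot by engine and JIT behavior):

```javascript
// Minimal micro-benchmark: property reads/writes through a Proxy vs. a plain object.
// Treat the numbers as rough indicators only; engines optimize these paths differently.

function bench(label, fn, iterations = 1_000_000) {
  const start = performance.now();
  fn(iterations);
  const ms = performance.now() - start;
  console.log(`${label}: ${ms.toFixed(1)} ms for ${iterations} iterations`);
}

const plain = { value: 0 };
const proxied = new Proxy({ value: 0 }, {
  // Identity traps: the smallest possible proxy overhead.
  get(target, prop) { return target[prop]; },
  set(target, prop, v) { target[prop] = v; return true; },
});

bench('plain set', (n) => { for (let i = 0; i < n; i++) plain.value = i; });
bench('proxy set', (n) => { for (let i = 0; i < n; i++) proxied.value = i; });

let sink = 0; // accumulate reads so the getter loop can't be optimized away
bench('plain get', (n) => { for (let i = 0; i < n; i++) sink += plain.value; });
bench('proxy get', (n) => { for (let i = 0; i < n; i++) sink += proxied.value; });
```

Running the read and write loops separately, as above, is what surfaces the asymmetry both of us observed between getter and setter overhead.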
Regarding the memoization, I was primarily referring to accessing the getter multiple times (i.e., not necessarily in DOM rendering), which can cause unnecessary computation in Gea (as far as I can tell). In my envisaged use cases, this could often lead to problems (with, e.g., large data sets).
How would this be accomplished in Gea so that reactivity would be preserved (i.e., mutating store0.url would trigger a new fetch)? Is it possible to "listen" to changes to store0.url and handle the async code?
Very interesting benchmark results... well, I guess that's proxies for you.
Getters in Gea are designed to be used as computed properties by the components' rendering cycles, and in that role they don't need explicit memoization. Users can implement their own memoized computed variables if they need repetitive access—in which case each store's `observe` method could be used to update the memoized value whenever a specific value changes in the store.
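Something like this user-land memoization is what I have in mind. Note that the `observe(key, callback)` signature here is an assumption for illustration (check the docs for the real API), and `TinyStore` is a stand-in, not a Gea store:

```javascript
// User-land memoized computed built on a store's `observe` hook.
// NOTE: `observe(key, callback)` is an assumed signature; `TinyStore` is a stand-in.

class TinyStore {
  #listeners = new Map();
  constructor(data) { Object.assign(this, data); }
  observe(key, cb) {
    const list = this.#listeners.get(key) ?? [];
    list.push(cb);
    this.#listeners.set(key, list);
  }
  set(key, value) {
    this[key] = value;
    (this.#listeners.get(key) ?? []).forEach((cb) => cb(value));
  }
}

// Recompute an expensive derivation only when `key` changes in the store.
function memoizedComputed(store, key, compute) {
  let cached = compute(store[key]);
  store.observe(key, (value) => { cached = compute(value); });
  return () => cached; // repeated reads hit the cache, not the computation
}

const store = new TinyStore({ items: [1, 2, 3] });
let computations = 0;
const total = memoizedComputed(store, 'items', (items) => {
  computations++;
  return items.reduce((a, b) => a + b, 0);
});

console.log(total(), total()); // two reads, one computation
store.set('items', [10, 20]);  // invalidates and recomputes the cache
console.log(total());          // → 30
```

For large data sets this keeps repeated getter access O(1) between store mutations, which addresses the repeated-access concern.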
And for the async case, for now the compiler doesn't handle this scenario. It could be added, but as you expect, for now the same `observe` method could help here as well to listen to changes on store0.url and refetch accordingly.
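To sketch what that would look like—with a hand-rolled proxy standing in for a Gea store (whose real `observe` hook would replace the listener list), and `fakeFetch` stubbing the network:

```javascript
// Sketch: mutating store0.url triggers a new fetch.
// `fakeFetch` stubs the network; the listener list stands in for a store's observe hook.

const fakeFetch = async (url) => ({ json: async () => ({ from: url }) });

const urlListeners = [];
let lastFetch = Promise.resolve();

const store0 = new Proxy(
  { url: '/api/v1', data: null, loading: false },
  {
    set(target, prop, value) {
      target[prop] = value;
      if (prop === 'url') urlListeners.forEach((cb) => cb(value)); // react to url changes
      return true;
    },
  },
);

urlListeners.push((url) => {
  lastFetch = (async () => {
    store0.loading = true; // runs synchronously, before the first await
    try {
      store0.data = await (await fakeFetch(url)).json();
    } finally {
      store0.loading = false;
    }
  })();
});

store0.url = '/api/v2'; // the assignment alone kicks off the refetch
lastFetch.then(() => console.log(store0.data)); // → { from: '/api/v2' }
```

The loading flag is tracked in the same store, which is the "track its loading state however you want" part from above.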
I wish I could bring it to React! That would save so many developers and so much natural resources!
I've been working on the library for 6 months, and it's built upon my previous libraries tartJS (2011), erste (2017) and regie (2019). I just like to squash my commits before I make a public release, and that just happened 4 days ago :)
This is an interesting idea, though, thanks for bringing it up! I will think about it and try to add it to the compiler. Especially native objects like `window` would be great to create handlers for in the compiler. It would make life really easy for the developer.
Yeah, I experimented with this a while ago: basically compile templates to extract their dynamic data dependencies and then poll those for changes, although I did it at runtime. I was able to have thousands of components running at fairly minimal cost; it turns out strict equality checks are pretty cheap, although there were issues with defining custom objects or anonymous functions inside the template. A compiler could be much smarter about this, including nested field access on objects, ignoring temporary objects and functions, and deduping checks so only the minimal set of values would be polled each frame. Values out of the viewport can "sleep" or be polled at a lower frequency. It's an interesting paradigm because it doesn't require the developer to write their state management in a specific way or to wrap third-party dependencies in a reactive wrapper.
Of course polling is unfashionable, seen as "crude" or "unoptimized," but typically most UIs have at most a few dozen input sources visible on screen, and polling that amount of data takes less than a millisecond even on slower mobile hardware. In extreme cases with thousands of data points the compiler could be smart and "short-circuit" the checks, or the component could opt into manual update calls.
But it's super powerful to have what amounts to true immediate mode UI, the entire issue of state management basically goes away. `window.state` becomes a perfectly viable option haha.
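A toy version of that polling approach, assuming the compiler has already extracted the template's dependencies into a list of getter functions (the extraction is the part a real compiler would automate):

```javascript
// Toy dirty-checker: poll extracted dependencies each "frame" and fire an
// update callback only when a strict-equality check fails.

function createPoller(deps, onChange) {
  let prev = deps.map((get) => get()); // snapshot of the last-seen values
  return function tick() {
    for (let i = 0; i < deps.length; i++) {
      const next = deps[i]();
      if (next !== prev[i]) { // strict equality: the cheap per-frame check
        prev[i] = next;
        onChange(i, next);
      }
    }
  };
}

// Plain global state, `window.state`-style: no wrappers, no special syntax.
const state = { count: 0, label: 'idle' };
const updates = [];
const tick = createPoller(
  [() => state.count, () => state.label], // compiler-extracted dependencies
  (i, value) => updates.push([i, value]),
);

tick();                 // nothing changed yet, no updates
state.count = 1;
state.label = 'busy';
tick();                 // both checks fail → two updates fire
```

In a real app `tick` would run on `requestAnimationFrame`, and sleeping off-screen values just means skipping their indices.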
Solid is honestly a beast, and I love it! It was a great challenge to even match its performance, let alone beat it. While several metrics are within the margin of error, performance on select row, swap rows, remove row, and clear rows is significantly better on Gea's side.
Having said that, pure performance wasn't the goal as much as the developer experience. I wanted to build something that felt _native_ to the language.
And thank you for the comment on accessibility—I just updated the website and the docs to make the text more legible.
But all of those metrics differ by something like 1 millisecond, and you've only got benchmark data from an M4 MacBook Pro. On the strength of this, you promise us:
"The fastest compiled UI framework — ahead of Solid, Svelte, Vue, and React."
I know you've put quite a bit of work into the underlying libraries here, but this is the sort of claim people are sure to poke at. Is the Gea code used in the benchmark published anywhere?
As the author... I somewhat agree :) But I really like synthwave, and ever since I came across the design trend I've wanted to use it somewhere. And I put the twinkling stars there on purpose—there are even shooting stars if you wait long enough. I understand it comes across as generic AI slop, but this is an early project and it will evolve. I will work on a PlanetScale-style webpage, and maybe I can keep the current one as an option you can toggle on :)