vladev's comments | Hacker News

Interesting that they use Rust for most things, rather than Deno. I would've expected more dog-fooding on their part.


So the reason we do this - and I should have mentioned this in the post - is that we explicitly do not want to dogfood here. Because JSR is the package registry: if it goes down and we cannot pull packages from the registry while deploying a new version of the registry (the version that fixes whatever took it down), that would be very bad. So we don't put anything that depends on the registry in the path of deployment of the registry :)

Also, a lot of the work that the registry does (parsing source code, analyzing dependencies, enforcing rules on the code, etc.) is shared with the Deno CLI, which implements it in Rust. The reason for this is that this work has to happen in Rust, because JavaScript, for the most part, cannot "self-host" itself (i.e. you cannot implement a JavaScript runtime on a JavaScript host if you want your runtime to provide more system bindings than the host runtime).

Finally, we do use Deno for many pieces of the registry where the "circular dependency" problem is not relevant. Namely the entire frontend uses Deno and Fresh :)


The JavaScript and Python ecosystems, somewhat ironically, have begun using Rust for their tooling, because the languages themselves are too slow.

Aside though: IMO Python is good as "glue" between low-level code. JS... I don't know, man.


My dream language/environment would be something that is ubiquitous, has a straightforward learning curve, has a REPL, and empowers dev productivity - like Python - but then, if I so choose, there would be an additional straightforward path for me to push my code through some compilation process in order to supercharge its performance/speed - like Go or Rust, etc... Basically, let me choose to deploy my code as a (dynamic) script or as compiled code... sort of like some hybrid of Python and Go, or Python and Rust, etc., but combined into a single lang./env. :-)


I believe this is essentially the goal of Mojo [0][1], at least for ML/AI development.

[0] https://www.modular.com/max/mojo

[1] https://news.ycombinator.com/item?id=35790367


Another rabbit hole for me to discover! Thanks!


That's Java. You can work in the REPL, and when you put your code in a file and launch it, it runs fast; very fast.


I can't deny the performance and scalability of Java. But I've never been a fan of the need to be so verbose when coding in it. Also, my memory of Java is from the late 90s... so maybe I'm missing something, but I can't recall there ever being a REPL in Java. So I guess TIL :-)


Not only is there now a REPL, but the language is not at all what you remember from the 90s.


That is what Scala claims to be. https://www.scala-lang.org/

Scala has its own downsides though. It is a "big" language in the sense that, because it tries to do everything, there are too many ways to do one thing.


Common Lisp, maybe?


Clojure is probably closest to that right now.


Interesting; I guess I have a rabbit hole to dive into. Thanks for the suggestion!


Deno is written in Rust, so it makes sense to write stuff like TypeScript compilation in the same ecosystem as your TypeScript runtime.

It sounds like they use Fresh (a Deno framework) for the web UI parts.

JavaScript for UI, Rust for data


Not really. Most software nowadays is more like: when I pay, it's Rust for best performance; when the user pays (in perf or actual dollars)... blah blah... JS/Electron for the win.


You can use the `--local` flag to build on your own computer/CI/VPS. `eas build --local -p ios --profile production` can locally build you an iOS bundle. Don't expect it to work out of the box, though. Your machine should have all the XCode tooling correctly installed and configured. Works for Android as well.


> Your machine should have all the XCode tooling correctly installed and configured. Works for Android as well.

I mean, that's the case for the 'regular' React Native workflow as well, and most mobile / cross-platform development. That said, yeah, getting RN running is a bit more effort compared to Expo, but it allows for a bit more flexibility.

We decided against using Expo because we had a number of 3rd party dependencies (trackers, analytics, chat, sigh) that had native components; it can work with Expo, but the dependency has to implement support for it, and that was a bit all over the place.


AFAIK using --local still requires an EAS account though.


It does. It's possible to make a patchset that lifts this restriction, though. If you're interested, I'd be willing to make it available.


The Expo setup that I use never calls into Expo servers at build time. For OTA updates, I've managed to get my self-hosted Expo update server working too, although I'm not using it at the moment, as it's incompatible with the React Native New Architecture (it's the last Expo package to lack this support - hopefully they'll add it soon).


Do you have the self-hosted OTA server open-sourced? I tried implementing it, but it didn't work for some reason.


Unfortunately my update server contains a bit more stuff which I'm not planning to clean up at the moment, but the logic of the update server really is taken right from their https://github.com/expo/custom-expo-updates-server repo. Perhaps you were blocked by this https://github.com/expo/custom-expo-updates-server/issues/12 ? If so, it's been fixed in the meantime; perhaps it works for you now?


Just to make sure: if the company goes bankrupt, will I be able to continue releasing updates etc.?


The Expo Updates protocol is an open standard specified here https://docs.expo.dev/technical-specs/expo-updates-1/

The hosted service offered by the Expo team, called EAS, has an implementation of an updates server that conforms to that protocol. If EAS went away, or you didn't want to use EAS, you could write and operate your own server that conforms to the protocol instead.
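
To make that concrete, here is a rough sketch of the shape such a server could take. To be clear, this is not the Expo team's implementation, just an illustration using the web-standard Request/Response API (it would run under Deno, for example); the header names and manifest fields are my reading of the spec linked above and should be double-checked against it - the full protocol also involves asset hashes, code signing, and multipart responses.

    // Minimal sketch of a self-hosted update manifest endpoint.
    // Assumptions: the "expo-platform" / "expo-runtime-version" request headers
    // and the manifest fields below, per my reading of the protocol spec.
    async function handleManifestRequest(req: Request): Promise<Response> {
      const platform = req.headers.get("expo-platform");         // "ios" | "android"
      const runtimeVersion = req.headers.get("expo-runtime-version");
      if (!platform || !runtimeVersion) {
        return new Response("Missing expo-* headers", { status: 400 });
      }

      // Hypothetical lookup: in practice you'd read the metadata of the bundles
      // you exported for this runtime version and describe their assets here.
      const manifest = {
        id: crypto.randomUUID(),
        createdAt: new Date().toISOString(),
        runtimeVersion,
        launchAsset: {
          key: "bundle",
          contentType: "application/javascript",
          url: `https://updates.example.com/${runtimeVersion}/${platform}/bundle.js`,
        },
        assets: [],
        metadata: {},
      };

      return new Response(JSON.stringify(manifest), {
        headers: { "content-type": "application/json" },
      });
    }

The custom-expo-updates-server repo mentioned elsewhere in this thread is the much more complete reference implementation.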


Please do so. I would like to build it totally locally at first, and if I realize that outsourcing the build to the EAS cloud saves more time and money, I can switch later. That sounds like a nice way to get more people to use Expo.


I've put a link to it on my other comment. Hope it helps.


I believe it still uses EAS for certificates by default. You can use "prebuild" to get the ios / android directories. Alternatively start a React Native app and install their modules separately. These options still work well but require a lot more setup.


Thanks. I wasn't aware that eas build works locally as well (since they deprecated expo build).

Is the build job also open source somehow?



Thank you very much and I'm glad you were able to get it working without the cloud stuff.


I would say that Arch's philosophy is about Exposed simplicity vs Hidden complexity. The wiki is extensive and covers a lot of cases one might stumble upon. Sometimes it feels like Arch's goal is to teach you how it works end-to-end. As a byproduct you get a working OS.


You'll be surprised, but it's RedHat that made most of that happen. A lot of the important Linux projects are developed mostly by them - Gnome, Wayland, NetworkManager, Pipewire, flatpak, etc.

Ubuntu, on the other hand, seem to like to do many things their way. Like aggressive patching. I recall fontconfig being heavily patched from upstream. Then we have Mir (now almost abandoned), Unity (abandoned), snap (flatpak, done differently, not yet abandoned :).


Well:

* NetworkManager is a dumpster fire.

* Wayland is 10 years in the making and it's still barely out of alpha and still missing crucial features such as fractional and consistent scaling.

* Pipewire is a welcome attempt to mitigate another source of grief, Pulse Audio.

* Gnome? Let's not get started...

Please. Red Hat has been a boon to the Linux community, but its lack of - how can I put it tactfully - design taste? has stranded the platform in a decades-long quicksand of endless circular reinvention.


You're not wrong on Wayland. I remember hearing people talk about how great Wayland was gonna be a decade ago.

Gnome and NetworkManager are fine though. My XPS 15's wifi has been supported for the last three years without any issues.


Wayland is a lot better than X. The problem with any of these low level enhancements is that they take a long time to plumb through the entire system. I wish it wasn't so, but it is.


"still missing crucial features such as fractional and consistent scaling"

This has very little to do with Wayland and mostly has to do with the apps and the toolkits. I haven't really seen many Linux-native apps that are able to function correctly at a non-integer scale. The rendering of these apps may have to be entirely changed and refactored to use floating point instead of integers. That's a big thing to ask every app to do. The hard part is doing that, and then it's trivial to put a flag in Wayland (or XWayland) for an app to say that it supports it.


Clipboard is consistently broken on Wayland. I don't use any XWayland application, it's all native Wayland. Sometimes when I copy things they don't paste, and I need to switch back to the original window for the copy to register. Or even copy again to be sure. This was never, ever a problem in more than 10 years using X11.

But, fractional scaling is working like a charm on Gnome + Wayland (after a gsettings command). Very crisp, despite people saying it doesn't work. On X11, even on KDE, I can't get fractional scaling this crisp. This is the only reason I'm using Wayland; all the rest sucks.

But the problem is, GTK doesn't support fractional scaling natively. Even GTK 4 supports only integer scaling. So for fractional scaling the compositor has to scale up, then down. This approach generally causes blurring (though I don't know why Gnome on Wayland here isn't blurred).

When I see screenshots of people with fractional scaling on Gnome, it appears very blurry. Comparing side by side, here it isn't. I don't know why, and at this point I'm afraid of messing with it and ruining everything.


"Sometimes when I copy things they don't paste, and I need to switch back to the original window for the copy to register."

I have honestly never had that problem in years of using GNOME Wayland, but I experienced it many times with misbehaved X11 apps. Maybe you want to come up with a reproducible test case and then report it?

"Even GTK 4 supports only integer scaling. So for fractional scaling the compositor has to scale up, then down. This approach generally causes blurring (though I don't know why Gnome on Wayland here isn't blurred)."

That's what I mean, it has to be done in the toolkits first. The first step would be to add support for that to GTK, which is unlikely to happen until at least GTK5. Then, after that, the apps can start to support it, so I think you're looking at at least a few years before there is a realistic chance of having that. Sorry to disappoint, it's just not an easy thing to have. And I don't think there is much incentive to support this from a hardware perspective either since most people that I see asking for this are using it as a workaround for oddly sized 4K monitors.

"This is the only reason I'm using Wayland; all the rest sucks."

I feel your frustration but I actually would not suggest using Xorg in 2021 unless you really know what you're doing. It's not secure unless you take great pains to make it so. GNOME's Wayland session is the most secure Linux desktop there is outside of security-focused distros.


I think the main (or only? not sure) culprit is Firefox (running natively on Wayland) https://bugzilla.mozilla.org/show_bug.cgi?id=1726360 - it's reported as fixed in Firefox 93, but I'm still on Firefox 92, so I guess I should just update.

> And I don't think there is much incentive to support this from a hardware perspective either since most people that I see asking for this are using it as a workaround for oddly sized 4K monitors.

All 14" hidpi laptops (meaning, more than 1366x768) suffer from this problem. 1x is too small, 2x is too large.


Those laptops I think would fall in that same category of displays, where the PPI is around 120-180. It's not enough of an enhancement to make the text crisp and not pixelated, and it makes everything look bad unless the apps native to 96 PPI start to implement a certain type of floating point scaling. The higher-end laptops of the same size just give you the same screen but with a higher PPI. Scaling up then scaling down doesn't cause noticeable blurring once you get past 250 PPI, so it's only that class of low-end displays/laptops that would benefit from this. I wonder if those displays will even still be around in a few years' time; I certainly would like to get a cheaper laptop around that size but with a higher PPI.


From my point of view, my screen - 1920x1080 at 14", at 189ppi, isn't really what I would call low end. Or at least, not in my country (Brazil).

Low end is 1366x768 at 14", which is 135ppi. Most laptops I see have this resolution.


I mean low end in the category of high-DPI laptops. I think that would be considered mid-range in the category of all laptops; that was my take from looking at prices recently, anyway. There are high-end laptops that are around the same size and have 4K screens.


> I think the main (or only? not sure) culprit is Firefox (running natively on Wayland) https://bugzilla.mozilla.org/show_bug.cgi?id=1726360 - it's reported as fixed in Firefox 93, but I'm still on Firefox 92, so I guess I should just update.

I was just about to write the same: the only clipboard issues I've had on Wayland were with Firefox. I don't know what they do, but even on Windows I get weird behavior at times (when copying out of Jupyter cells, for example), though much less than on Wayland.


Lack of functional fractional scaling is what makes me want to switch to Windows or Windows+VM or Windows + ssh to compile.


Ah sorry... so the major issue is not with Wayland itself, but with the major toolkits that in 10+ years haven't bothered or figured out how to transition away from a rasterized buffer model.

Right. It doesn't matter if the fault is with GTK, Qt or the Wayland APIs not providing enough support for this to happen, sometime in the last 10 years. The problem is that it didn't.

In the meantime I've been using macOS, which has been doing vector graphics for, what, 15 years?

SMH...


This isn't about vector graphics. There is always going to be a rasterized buffer, that's unavoidable. Even more so as apps are moving to newer rendering APIs like Vulkan and Metal. Apps doing custom rasterization is probably going to get even more common. The real issue there is what scale you do the rasterization at, because some things just cannot be expressed in terms of vectors. I believe Qt does support doing floating point scaling in some cases [1] but I've heard it's really broken because the apps still need to make changes to support it. You can't just take any old Qt app and recompile it with the moral equivalent of "#define int float".

This is also not a matter of "fault"; by necessity the majority of the work that needs to be done here is in the toolkits and apps. That's just the way it is. It's not that hard to implement this on the server side, you just don't scale that buffer when rendering. macOS is not a good example, because it also uses an integer scale.

[1] https://doc.qt.io/qt-6/highdpi.html


Ok, let me rephrase. What I mean is - how I see it - that Apple just made the choice for every developer and gave them a conceptual model and tooling that does not depend on individual pixels or require them to make an extra effort in adoption (because you know how that goes.)

It baffles me that Linux is still stuck on this pixel grid that just hangs together at a specific density of 96dpi; something that harks back to the VGA days

https://docs.microsoft.com/en-gb/archive/blogs/fontblog/wher...


Well, Xorg is also missing fractional and consistent scaling, because there are many bugs with it.


I use fractional scaling with Xorg so it is not missing. Not sure about consistency. It is acceptable for me though.


I get annoyed when I see posts like this with revisionist history.

I can't comment on everything but:

> Unity (abandoned)

You probably don't recall how bad Gnome 3 was initially, and how much better Unity was than Gnome 2. Sure, it also had its issues in the beginning, but it took a really long time for Gnome 3 to catch up.

> snap (flatpak, done differently, not yet abandoned :).

I am pretty sure flatpak didn't exist when snap was announced. Or it was around the same time.

Given the above two, I'd place very little credence in the rest of your points. I agree with the Mir situation to some extent, but look at how long Wayland has taken to ship.


> You probably don't recall how bad Gnome 3 was initially

I do, and I have never seen its face again since then - since the huge upset of ruining what worked well to create something perverse. And unless it was fixed very recently, it still is, because I just tried `gnome-font-viewer` and there are fades in the interface that cannot seem to be disabled. Unjustifiable. Not just the desktop manager: the paradigm. Dysfunctional effects imposed on the user for no reason. And in a context which replaced functions with minimalism.

I am still wondering what caused that stroke of lunacy - at the time, I thought it must have been literally a stroke.


> I am still wondering what caused that stroke of lunacy - at the time

Power users have privilege and need to be hobbled; their complaints ignored. The needs of users with the least privilege, the computer illiterate, should be prioritized.

Or something like that.


The general sentiment I see is that if a feature is only used effectively by a small handful of users then it's probably at risk of getting removed. That's just numbers; it doesn't make a difference whether it's "power users" or any other users. Either way, maintainer time is limited and sometimes they have to make a decision to drop a feature that isn't pulling its weight.

But I've also never heard any description of the phrase "power user" that was clearly defined. Wikipedia says:

"A power user is a user of computers ... who uses advanced features"

"In enterprise software systems, 'Power User' may be a formal role given to an individual who is not a programmer, but who is a specialist in business software"

So which advanced features and which business software are we talking about here? That could be anything really.


> if a feature is only used effectively by a small handful of users

We do automated computing in order to have tools available in general, as we may need them; and we use the special features when we do need them.

Imagine ImageMagick triaging options or functions the same way ("Lanczos stays in, but we could ditch Hamming").


Those special features all have a maintenance cost, and if no one is around to pay it then their usefulness will diminish until they reach the threshold where it's not worth it anymore. Not sure why any filtering algorithm would be considered out of the ordinary. Do you know how many scientific and mathematics packages I've seen that are really old, outdated, and suffering bitrot? Quite a few. In fact that seems to be a field where the old algorithms are discarded fairly regularly in favor of new ones. The older algorithms that remain popular do tend to stick around.


Then some mechanism of deprecation could be immensely better than removal (in software with tolerance - an image viewer or an audio editor are not the same as medical equipment firmware or banking software). It could also encourage the interested to review and update the code in case of need.

Some pieces of software are more free to grow, their value is in options (image processing), while others have value in their reliability (banking).

This said, one phenomenon that appeared in the "unfortunate decade" and which is extremely dubious is that of the propaganda that "less is more". No it is not. Freedom is "more". Constraints are "less". (In the general reality of desktop software applications.)


> there are fades in the interface that cannot seem to be disabled. Unjustifiable.

That's an accessibility option. GNOME Settings -> Accessibility -> Enable Animations; or `gsettings set org.gnome.desktop.interface enable-animations false`. Should apply everywhere.


> org.gnome.desktop.interface enable-animations

Yes, but it does not disable the fades, nor all the gratuitous animations I see in the GUI of gnome-font-viewer (one of the very few pieces I can try, not having uninstalled it). Those animations must have been hardcoded in the software, which would probably have to be recompiled - and I do not take it for granted they can be disabled through a configuration flag.


I just tried it and it disabled the fades. Maybe you made a typo in the setting? Most GNOME apps are not going to have animations hardcoded; they typically use standard widgets like GtkStack or GtkRevealer that handle the animations automatically and respect the system setting. You can verify this by opening the GTK inspector in Font Viewer.


> Maybe you made a typo in the setting

Before posting I tried `gsettings get org.gnome.desktop.interface enable-animations` and it returns 'false', consistent (as it should be) with what `dconf-editor` shows for the option. And I verified (as I just re-did) the behaviour of `gnome-font-viewer`: changing the window content (switching the font) fades (messily) from the former to the latter, using the menu animates a "pull-down", and activating the search slides the interface down...

I will try the GTK inspector today.


You are absolutely correct that Gnome 3.0 was bad. Even more importantly, there isn't anything whatsoever inherently wrong with going your own way. Doing so is ultimately how we end up with a useful marketplace of tools to pick from.

If you stop worrying about the victimless crime of fragmentation, neither Gnome sucking nor the specific sequence of development matters.

For what it's worth, I think they date from around the same time.

https://launchpad.net/snapcraft/+milestones September 2015

https://en.wikipedia.org/wiki/Flatpak September 2015

The larger issue would seem to be that Canonical has a habit of betting on losers it ultimately abandons. Snap looks like the next roadkill to me. Far from being universal, it will probably never see substantial use outside Ubuntu.

It suggests perhaps that they should exercise better judgement.


> It suggests perhaps that they should exercise better judgement.

I have a feeling many of these decisions are driven more by internal politics than by proper analysis. The first question when you decide to go your own way is whether others will find this more useful than the competition. Building better tools is not sufficient if you don't invest in evangelising and great branding.


The fontconfig patches are a big reason I stay with Ubuntu. Fonts look _so_ much better on Ubuntu than Arch/etc.


You can install their patched version of fontconfig on Arch

- https://aur.archlinux.org/packages/fontconfig-ubuntu/

Then you can use their font config confs

Here is the list they are using: https://hastebin.com/fukibunuji.conf

Copy-paste what you have in your Ubuntu install and, believe me, fonts will look similar!

Make sure to also install the Ubuntu font family

https://archlinux.org/packages/community/any/ttf-ubuntu-font...

And set your font hinting to: slight

Screenshots:

https://i.imgur.com/lKgvBeG.png

Looks good even on YouTube with Dark Theme:

https://i.imgur.com/dohPxoL.png


Agreed, Red Hat developers are behind many foundational packages (long-term thinking?), but I credit Ubuntu with bringing many users into Linux. They continue this with the WSL stuff... While I see how distribution non-cooperation is difficult, at least the users of Red Hat and Ubuntu-derived distributions have vastly more in common than what they differ by.


Note that there's also Streamlit [1]. It uses regular Python files, rather than notebooks, so they can be easily version controlled. And it has more UI tools.

[1]: https://streamlit.io/


Exceptionally strong recommendation for streamlit from me.

I can create a GUI for a tool that looks nice faster than I can make a CLI. I've built useful production systems (ok, sure, for internal use) in literally minutes.

You're a bit limited in what kinds of apps you can make, but the tradeoffs it makes here mean that it's astoundingly easy to make a wide range of very useful tools.


I wasn't aware of this, so thanks for sharing. I've been setting up a repo that utilises GitHub Actions to build exe/app files as noted in this guy's blog...

https://data-dive.com/multi-os-deployment-in-cloud-using-pyi...

It uses PyInstaller to build, even pushes the build as a zip to your releases page on GitHub, and appears to be working quite well.


Streamlit is great for demos but not for building a product.


Notebooks aren't great for building a product, either.


I am waiting for JetBrains' DataSpell, which supposedly combines notebooks with IDE goodies.

https://www.jetbrains.com/dataspell/


Indeed! I find working in notebooks to be singularly unpleasant, mainly because of the lack of full IDE support. Others have deeper reasons to dislike notebooks:

https://news.ycombinator.com/item?id=19859913

As a note to OP, messaging your solution as "turn your notebook into an app" may not be optimal — you will lose many who abhor working in notebooks.


Just trying it out now, but why is it not good for products?


The main appeal is the low effort-to-coolness ratio. But layouts are limited and you hit a wall if you try to implement even simple interactions. State management used to be rough, but maybe it has improved lately.


In my daily routine we are using Streamlit and it is pretty decent, mainly because you do not have to care much about the backend. And, as you mentioned, it has an impressive amount of UI tools and a relatively active community.


This is great. I've been looking for the equivalent of RShiny in the Python world and had never heard of Streamlit before.


We've been using Zoom for the past 2 years and their Linux support is outstanding. Not the prettiest of UIs, but very reliable.


Can second that. Zoom works on Windows/Linux/Mac.


I have another HiDPI laptop with the same wireless card (Intel 7260). I don't think your router is the problem; I'm experiencing the same issues. When I wasn't in the same room as the router, the signal used to drop 50% of the time. A few driver updates later things are much better, but it still freezes sometimes. Consider disabling the power saving options - it does help. It seems Intel totally botched it with this card, as the Internet is littered with people complaining about it, including Windows users. It seems the Windows drivers have it covered now, but the Linux ones are still catching up.

Regarding the HiDPI - Chrome is the only thing that doesn't play well, but since I'm a Firefox user, that's not a problem. Gnome 3 handles it quite well. Yes, some apps (e.g. Skype) get their font rendering a bit wrong, but otherwise it's fine. But the crisp fonts are something I'm not giving up.


That's exactly what I was seeing. If I take the laptop out of the room with the router I start instantly having trouble. Just being one room away, sometimes I can't connect at all.

Even sitting right next to the router, my bars aren't full which is really strange.

I'll try fiddling with the power saving options. Thanks for the advice.


For posterity's sake... I kept having bad issues with the wifi. The problem seemed to get worse after the computer went to sleep. It would take websites a long time to load and drop out completely sometimes. I ended up getting a new router since I saw on a couple of forums that it helped. Specifically I got a Linksys WRT AC1900. The problems so far seem to have been resolved. Websites load quickly and no cut-outs so far.


I actually wonder if it can be done even more elegantly with RxJS?


Thanks for the heads up about RxJS, didn't know that existed.

Seeing as one of the benefits of React is in server-side rendering, and now with the new OSS nature of .NET, it's worth considering standard Rx as well ( https://msdn.microsoft.com/en-gb/data/gg577609.aspx ).

Rx is very new to me, so I may have this description wrong, but it essentially seems to be a well-supported event-driven programming library. Worth a look if you're interested in FRP.


Rx isn't FRP, and doesn't really solve the same problem that React does (view management). It's great for pushing discrete events around, but doesn't provide for any signals or continuous binding.


Rx programming is so great. It's one of those things that you seldom find a use case for, but when you do, it leads to 50x easier and more readable code. It truly is for events what LINQ was for collections. Mostly only useful with more advanced UI and network programming, though.
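
As a small illustration (just a sketch, assuming RxJS 7-style imports, a hypothetical "#search" input and a hypothetical "/api/search" endpoint), composing UI events looks something like this:

    import { fromEvent } from "rxjs";
    import { debounceTime, distinctUntilChanged, map, switchMap } from "rxjs/operators";

    // Turn raw keystrokes into debounced, de-duplicated search requests.
    const input = document.querySelector("#search") as HTMLInputElement;

    const results$ = fromEvent(input, "input").pipe(
      map(() => input.value.trim()),
      debounceTime(300),        // wait for typing to settle
      distinctUntilChanged(),   // skip identical consecutive queries
      switchMap((q) =>          // drop stale in-flight requests
        fetch(`/api/search?q=${encodeURIComponent(q)}`).then((r) => r.json())
      )
    );

    results$.subscribe((items) => console.log(items));

The LINQ comparison holds up well: it's the same map/filter/flatten vocabulary, just over pushed events instead of pulled collections.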


You may want to take a look at elm-html.


Also Cycle (virtual-dom + Rx)


I think RxJS [1] can help here as well.

[1]: https://github.com/Reactive-Extensions/RxJS


I wonder if HiDPI support for Linux (and Windows) finally works.

Edit: just grabbed the community edition and the answer is... no. Even with -Dis.hidpi=true - no luck.


Looks like Oracle is not going to support it out of the box in upcoming JREs, so we're going to do it ourselves in our custom JDK.


Can you give any more details about this? Are you going to be trying to fix a lot of the bugs on OSX on JDK > 6 too?


Is this only an issue on Linux and Windows? I'm running Java 8u25 on OSX without an issue. Java 7u40 introduced high dpi support [1], but I didn't know it was only a fix for OSX.

[1]: http://bugs.java.com/view_bug.do?bug_id=8009754


I really hope it does. IntelliJ is the main reason why I haven't bought a HiDPI laptop yet (Chromium also has some issues, but Firefox works great).


Try a MacBook Pro with Retina. Works perfectly.


Looks crystal clear on my Retina MacBook Pro running Yosemite. Might be an OS X-specific fix however, since HiDPI is way more common in the Mac world.


Works fine for me on Linux. Some icons are too small, but apart from this, no complaints.


Could you share your setup - distro, desktop environment, settings, etc.?


try with JDK 1.7 (Java 7)

