Hacker News | past | comments | ask | show | jobs | submit | cxr's comments

> There are only two ways to make money. One is to bundle; the other is to unbundle. —Jim Barksdale

The original (or just the Firefox 3-era Places revamp?) bookmarks implementation in Firefox had a multi-line field to jot down a personal note or add a description of the page. Even bookmarks folders were allowed to have descriptions—see the New Folder dialog in this screen capture at ~1 minute in <https://youtu.be/QoJXmLuGM3s?t=60>

The "Description" field was removed in 2018. <https://bugzilla.mozilla.org/show_bug.cgi?id=1463738> Being able to capture and redirect the resources that would inevitably have gone into the maintenance costs of this feature over the last 8 years is likely the reason Mozcorp has been able to stay afloat on the meager budget they have to carve out of their half-a-billion-dollar revenue stream year after year. Giving users uninterrupted access to Descriptions over that span would likely have bankrupted them.


> Can you please elaborate on this?

You're replying to an LLM-powered comment generator.


> I don't understand why you need to switch out the VCS to fix that issue.

For some reason, when it comes to this subject, most people don't think about the problem as much as they think they've thought about it.

I recently listened to an episode on a well-liked and respected podcast featuring a guest there to talk about version control systems—including their own new one they were there to promote—and what factors make their industry different from other subfields of software development, and why a new approach to version control was needed. They came across as thoughtful but exasperated with the status quo and brought up issues worthy of consideration while mostly sticking to high-level claims. But thirty or forty-five minutes into the episode, as they were preparing to descend from the high level and get into the nitty gritty of their new VCS, they made an offhand comment contrasting its abilities with Git's, referencing Git's approach/design wrt how it "stores diffs" between revisions of a file. I was bowled over.

For someone to be in that position and not have done even a cursory amount of research before embarking on a months (years) long project to design, implement, and then go on the talk circuit to present their VCS really highlighted that the familiar strain of NIH is still alive, even in the current era where it's become a norm for people to be downright resistant to writing a couple dozen lines of code themselves if there is no existing package to import from NPM/Cargo/PyPI/whatever that purports to solve the problem.


> they made an offhand comment contrasting its abilities with Git's, referencing Git's approach/design wrt how it "stores diffs" between revisions of a file. I was bowled over.

It seems like you have taken offense to the phrase "stores diffs", but I'm not sure why. I understand how commit snapshots and packfiles work, and the way delta compression works in packfiles might lead me to calling it "storing diffs" in a colloquial setting.


> It seems like you have taken offense to the phrase "stores diffs", but I'm not sure why.

Yeah, I'm not sure why it seems that way to you, either.

> the way delta compression works in packfiles might lead me to calling it "storing diffs" in a colloquial setting

We're not discussing some fragment of some historical artifact, one part of a larger manuscript that has been lost or destroyed, with us left at best trying to guess what they meant based on the little that we do have, which amounts to nothing more than the words that you're focusing on here.

Their remarks were situated within a context, and they went on to speak for another hour and a half about the topic. The fullness of that context—which was the basis of my decision to comment—involved that person's very real and very evident overriding familiarity with non-DVCS systems that predate Git and that familiarity being treated as a substitute for being knowledgeable about how Git itself works when discussing it in a conversation about the tradeoffs that different version control systems force you to make.


A common misconception is that git works with diffs as a primary representation of patches, and that's the implied reading of "stores diffs". Yes, git uses diffs as an optimisation for storage but the underlying model is always that of storing whole trees (DAGs of trees, even), so someone talking about it storing diffs is missing something fundamental. Even renames are rederived regularly and not stored as such.

However, context would matter and wasn't provided - without it, we're just guessing.
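The snapshot model is easy to verify for yourself. A minimal Python sketch (stdlib only): git names each revision of a file by hashing a `blob <size>\0` header plus the file's *entire* contents, so two revisions that differ by one line are two independent, complete snapshots—the same object ID `git hash-object` would report. Delta compression only happens later, inside packfiles, without changing that model.

```python
import hashlib

def git_blob_id(data: bytes) -> str:
    """Object ID git assigns to `data` as a blob (SHA-1 repositories).
    Matches `git hash-object --stdin` fed the same bytes."""
    header = b"blob %d\0" % len(data)
    return hashlib.sha1(header + data).hexdigest()

v1 = b"hello\n"
v2 = b"hello\nworld\n"

# Each revision hashes the entire contents -- a full snapshot,
# not a diff against the previous revision.
print(git_blob_id(v1))  # ce013625030ba8dba906f756967f9e9ca394464a
print(git_blob_id(v2))  # a different, equally self-contained blob
```

Nothing in the object ID depends on what the previous revision looked like, which is exactly why "stores diffs" mischaracterizes the design.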


The problem with comments like these is that guessing what "better language" a commentator has in mind is always an exercise left up to the reader. And that tends to be by design—it's great for potshots and punditry, because it means not having to make a concrete commitment to anything that might similarly be confronted and torn apart in the replies—like if the "better language" alluded to is C (and it generally is)—the language where the standard library "steers" you towards quadratic string operations because the default/natural way to refer to a string's length is O(n).
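To make the quadratic claim concrete, here is a Python sketch of the C idiom (function names are illustrative, not any real API): `strcat` has to re-scan the destination for its NUL terminator before every append, so n constant-size appends cost O(n²) character scans, versus O(n) total work if you carry the length along yourself.

```python
def strcat_appends(pieces):
    """Append by re-scanning for the end each time, like C strcat.
    Returns the result and the total characters scanned."""
    dst, scanned = "", 0
    for p in pieces:
        scanned += len(dst)   # strlen(dst): walk to the terminator
        dst += p
    return dst, scanned

def tracked_appends(pieces):
    """Append while remembering the current end offset -- no re-scans."""
    dst, end = "", 0
    for p in pieces:
        dst += p              # write at the remembered offset
        end += len(p)
    return dst, end

pieces = ["x"] * 1000
_, naive_work = strcat_appends(pieces)
print(naive_work)  # 499500 == 999*1000/2: quadratic in the piece count
```

This is the pattern sometimes called "Schlemiel the Painter's algorithm"; the fix is trivial, but nothing in the C standard library's default string interface nudges you toward it.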

Maybe, if the Jolla folks were serious about making inroads in the market for personal mobile devices that they're ostensibly trying to compete in. But they're just as deluded and as doomed as their Meego/Maemo/Moblin predecessors about the value proposition that the SDKs and system software they ship have for the market segment they're targeting.

It sounds like you're not grasping the meaning of the linguistic construction being used by the person you're quoting. (Or you're being deliberately deceptive about your understanding of their intent. But it's probably just the former. I'm guessing you're ESL.)

"Ruining Android for everyone" ("to try to maybe help some") does not mean, "Android is now ruined for X, for all X." It means, perhaps confusingly, pretty much the opposite.

It means: "There exists some X for which Android is now ruined (because Google is trying to protect Y, for all Y)." (Yes, really. The way the other person phrased it is the right way to phrase it—or, at least, it's a valid way to phrase it.)


> cross-platform GUI only looks good on the platform that it was originally designed for

Formulated more rigorously: cross-platform GUIs, and outsider, non-Mac-first GUIs ported to Mac OS, look (and feel) bad on Mac. The opposite is virtually never true, though; there aren't really high standards for beauty or consistency on the other platforms. Windows, for example, in this decade is a mishmash of different toolkits (even from Microsoft). Desktop GNU/Linux users form a faction of people who either don't care about GUI beauty or have standards about on par with Windows folks'—and are generally so grateful just to have an app that ships* on their platform that they won't reject outright any Mac-first app (and that would be true even if it painted itself as a pixel-for-pixel match of the Mac OS version).

* and runs; I still run into "cross-platform" apps that are Electron builds packaged as AppImages and that terminate at launch, even on something as unremarkable as Ubuntu


> The opposite is virtually never true though; there aren't really high standards for beauty or consistency on the other platforms

Beauty is in the eye of the beholder. I don't find Apple's UI beautiful, and consistency is missing in some places, at least on iOS.


I don't think that's actually true. OS X really wants more whitespace between controls than other OSes do. If you just naively port an OS X dialog to Windows or Linux, it'll seem too spread-out.

And what are your thoughts on Frito chili pie?

> A portable GUI interface is a hard problem, unless you mean "a browser window without an URL bar" and your controls are HTML/CSS components[…]

> […] an abstraction layer on top of several genuinely different systems[…]

> […] wxWidgets is perhaps the closest we can get[…]

For good reasons—because you'll likely exhaust yourself or run out of resources before you finish the project, and because it doesn't tend to help the readability of the codebase either—it should be ingrained in all programmers to be strongly against "portability" approaches that try to make platform X work like platform Y by providing implementations of Y's APIs on platform X. (See also yesterday's news from Wasmer about their AI-coded approach[1].) The goal is almost always better achieved by defining, up front, the minimum interface the program needs in order to work on the host—which forms the kind of abstraction you're talking about—and then connecting that to the actual host APIs (which usually suck to use, anyway), repeating for each platform. Almost.
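The "minimum interface up front" approach can be sketched in a few lines (a Python illustration with hypothetical names; the real interface would be whatever your program actually needs from the host, and each real backend would be a thin adapter over the platform API):

```python
import sys
from abc import ABC, abstractmethod

class Notifier(ABC):
    """The minimum this hypothetical app needs from the host: one call.
    The app is written against this, never against a platform API."""
    @abstractmethod
    def notify(self, title: str, body: str) -> None: ...

class StderrNotifier(Notifier):
    """Fallback backend. Real backends would wrap the host facility
    (e.g. user notifications on macOS, libnotify on Linux, toasts on
    Windows), each a small per-platform adapter."""
    def notify(self, title: str, body: str) -> None:
        print(f"[{title}] {body}", file=sys.stderr)

def host_notifier() -> Notifier:
    # Select the backend for the current host; only the fallback
    # is sketched here.
    return StderrNotifier()

host_notifier().notify("build", "done")
```

The point is the direction of the dependency: the interface is shaped by the program's needs, and the platform-specific code shrinks to a set of small adapters, instead of one platform's API being reimplemented wholesale on the others.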

Desktop UI toolkit APIs are one exception to this—which is unfortunate, because it's like the one use case where people actually try to do the opposite of the usual impulse, and we're all worse off for it.

The major desktop platforms are so few, and the APIs are so stable and so slow-moving, that it's well past the point where the cross-platform native app solution should have been adopting/implementing the platform Y APIs everywhere, where "platform Y" is Cocoa[2]. Like, at the latest, the Oracle v. Google decision in 2021 should have been the last hurdle. People get weird about this, though, and confuse APIs with implementations, or conflate Cocoa with Objective-C (and/or a mandate to program in Objective-C), or think this amounts to a claim that we should all be using GNUStep. It's not. It's not any of those things. It's very simple: the window-and-widget APIs across all desktops should be Cocoa-shaped. That's it.

1. <https://wasmer.io/posts/edgejs-safe-nodejs-using-wasm-sandbo...>

2. <https://news.ycombinator.com/item?id=30359206>


It doesn't look like they're "trying to say" anything. They said it.

They opened the homepage and heard and/or felt their fans rev up, which didn't leave a good impression. They don't have confidence that your product/service is worth paying for (perhaps not worth using even if it were free).

There's nothing else to figure out.


> Would you voluntarily negotiate a pay cut just because you can charge a fraction of what you do and still swim in cash

You're posing the question like there's an obvious answer, and that that answer is "no". In reality, all kinds of people do this all the time.

