The real issue is the lack of support. Real users buy support packages from vendors. They bring their PC in to be fixed at the local shop. They might Google a problem and occasionally find a solution if they're feeling spicy, but how often do you find screenshots walking you through something in a Linux GUI? Web-browser-only laptops are great until your uncle gets a "killer deal" on some random printer on Facebook Marketplace and can't get it to work. Or a webcam. Or a Bluetooth headset. Or a game controller. Or a scanner. Etc., etc.
On top of all of this, they will just give up and buy a new machine and return it if that doesn't fix their issue.
Linux provides virtually nothing on any of those fronts unless your grandson serves as your private level-8 tech support contact. And who wants to be on call 24/7 for their extended family?
Regarding slow startups, I am not sure this applies to any use cases I can think of where it would not also be a concern in python, etc. JVM startup times have never meaningfully impacted my workflow in the last 15 years.
The why is quite simple, in my opinion. I see Java devs reaching for other accepted tools for such things, opening a whole can of worms by introducing a new language that is only "required" by convention. I would love a rich Java ecosystem of TUI/CLI libraries so I could reuse all of my existing business logic and company libraries. The lack of extremely streamlined wrappers is the only barrier. In my work environment, this would be a great addition.
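For illustration, a minimal sketch of what I mean: exposing existing business logic through a plain-Java `main` entry point, no new language required. `InvoiceCli` and its `total` method are hypothetical stand-ins for a company library, not any real API:

```java
import java.util.Arrays;
import java.util.List;

public class InvoiceCli {
    // Hypothetical stand-in for existing business logic in a company library.
    static double total(List<Double> amounts, double taxRate) {
        double subtotal = amounts.stream().mapToDouble(Double::doubleValue).sum();
        return subtotal * (1.0 + taxRate);
    }

    // Thin CLI wrapper: first argument is the tax rate, the rest are amounts.
    public static void main(String[] args) {
        if (args.length < 2) {
            System.err.println("usage: invoice-cli <taxRate> <amount>...");
            System.exit(64); // EX_USAGE
        }
        double taxRate = Double.parseDouble(args[0]);
        List<Double> amounts = Arrays.stream(args).skip(1)
                .map(Double::parseDouble)
                .toList();
        System.out.printf("%.2f%n", total(amounts, taxRate));
    }
}
```

A streamlined wrapper library would mostly be about replacing the hand-rolled argument parsing above with declarative options; the business logic itself stays untouched.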
Right, as I said I also think Python has similar issues with startup time.
I've used a limited number of Java CLIs; the most obvious is Gradle, which has never felt snappy to use - it is annoying when even basic things take 2+ seconds. Not the end of the world, I guess, but that seems suboptimal compared to a system that feels fast to use and hence well engineered.
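To put rough numbers on that feeling, here is a sketch of measuring bare runtime startup by timing a no-op launch. It assumes `java` is on the PATH; the class name and helper are mine for illustration, not any standard tool, and the numbers will vary a lot by machine:

```java
import java.util.List;

public class StartupTimer {
    // Time how long it takes to launch a process and wait for it to exit.
    static long millisToRun(List<String> cmd) throws Exception {
        long start = System.nanoTime();
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        p.getInputStream().readAllBytes(); // drain output so the child can exit
        p.waitFor();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // A no-op JVM launch approximates pure startup overhead; Gradle's
        // observed latency is this plus everything the tool does on top.
        System.out.println("java -version: "
                + millisToRun(List.of("java", "-version")) + " ms");
    }
}
```

The same helper can time `python3 -c pass` or any other interpreter for a side-by-side comparison.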
Alternatively, companies will just go back to corporate cards with very strict vendor rules. When vendor rules are not enough, they will go back to extremely tight per diems and simply not care whether the expense was strictly real or not. I can't see any advantage to a whole new currency and exchange - which every vendor in the world would now have to support for this to be useful - over these simple measures.
I'm not sure either; it's mostly a guess, more than a prediction. But if companies/governments can get more tracking, that's usually the way they move.
So if the companies that are complaining in the article implement your simple measures, does that mean the fraud problem they're talking about has already been solved? That begs the question: why aren't they just doing that, then?
The problem isn't "spending above/below per diem" as I understand the article. The problem is that whatever spending is happening, might be fake, that's why they're complaining about "faking expense receipts".
So it does seem like the companies themselves do worry about whether what you spend it on is legit or not; it's almost the entire point of the article, unless I've misunderstood something.
In my experience, this form of expense management is a relatively new development. Not so long ago, I had a company-linked corporate card. Credit card transactions are already labeled with the type of purchase, and you have to submit a report that corresponds to the actual charged amount. Additionally, they get all the bills from their hotel partner to cross-reference the transactions against the credit card and the submitted expense. Flights etc. worked the same way. Those are now additionally tracked in the company travel portals, which accomplish the blockchain without the blockchain.

None of this requires a global immutable currency ledger or anything like that to accomplish their goal: just get some reasonable transaction validation long enough to process the expense, then never look at it again past an audit. Later, they also made people eat at the partner hotel, negating the issue for meal expenses. It's just not a technology problem. If it were, they would just demand more granularity from the credit card company and the employee, and reject things that are out of policy.
I get where you are coming from. However, language like this matters when it comes to legislation. People outside the space will be guided by the "sideload" framing to think it's just "something extra on the side, so why should I care?"
I understand what sideloading means, as I'm sure the rest of HN knows. But to the layman non-techie, it has indeed been marketed as a boogeyman.
Even in the Android developers blog post:
> We’ve seen how malicious actors hide behind anonymity to harm users by impersonating developers and using their brand image to create convincing fake apps. The scale of this threat is significant: our recent analysis found over 50 times more malware from internet-sideloaded sources than on apps available through Google Play.
The research paper showing their methodology for arriving at these results has not, to my knowledge, been published by Google. Just a mere "trust me, bro."
But even if they used the term installing apps instead of sideload, what other word would they use? If they said "50 times more malware from internet sources than on apps through Play Store" people will still come up with their own wording.
If they used the phrase "installing apps," they would need to say "installing apps from outside of the Play Store," in which case people are going to automatically come up with a shorter word to associate with that. Any word we come up with is going to be subject to being used for both the good and the bad.
I’m giving the Asus Proart P16 a try, if I was trying for maximum battery life I’d probably avoid the discrete gpu (e.g. a zenbook or similar). I am not a full time road warrior so battery life is important, but not absolutely critical. There may be some options out with better battery life, but I haven’t been focused on this metric too closely.
I just don’t want to be sidetracked by overly aggressive and pushy decisions by my OS vendor anymore. I’ve been happy with the stability of my Debian/Ubuntu systems for getting work done. I have been using apple laptops for 10+ years, and I still like their build quality, but I don’t like the direction macOS (or Windows) is heading.
I have extreme doubts that there is any meaningful number of windows users holding out on trying macos based on such a thing.
The users are much simpler than this. Most have never even tried a Mac. If they want to, they will just buy one the next time they need a computer and accept the new experience as the new norm.
I got stuck in a fullscreen YouTube video the first time I tried an iPhone. Simplicity is relative. For years, the lack of a back button resulted in this weird behavior of having to learn how each app wants you to navigate it. Even now that everyone has settled into the same list of 5ish methodologies, it can be cumbersome to figure out.
I can see how these things would be convenient, if it succeeds. I struggle because my personal workflow is to always keep two copies of a repo up at once: one for deep thought, one for drone work. I have always done these kinds of background tasks while in meetings, compiling, etc. I have not seen much productivity boost from this. Oddly, you would think being able to offload further during that time would help, but reviewing the agent output ends up being far more costly (and makes the context switch significantly harder, for some reason). It's just not proving consistently useful, for me.