I believe they're showing an example of where people use (incorrect) regex patterns for validation. That's the whole thrust of the previous paragraph. `hadc0ffee` is not valid hex, but the function says it is due to, I presume, an incorrect regex pattern. I'm not familiar enough with regex to know why it's wrong, though.
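I don't know what pattern the article actually used, but for illustration, one classic way a hex check goes wrong is forgetting the anchors: an unanchored pattern happily finds the hex substring "adc0ffee" inside "hadc0ffee" and reports the input as valid. A quick sketch in Java (the pattern here is my guess, not the one from the article):

    import java.util.regex.Pattern;

    public class HexCheck {
        public static void main(String[] args) {
            Pattern hex = Pattern.compile("[0-9a-fA-F]+");

            // find() only looks for a matching substring, so the embedded
            // "adc0ffee" makes "hadc0ffee" appear to be valid hex.
            System.out.println(hex.matcher("hadc0ffee").find());    // true

            // matches() implicitly anchors the pattern to the whole input,
            // so the leading 'h' correctly causes a rejection.
            System.out.println(hex.matcher("hadc0ffee").matches()); // false
        }
    }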
Edit: I was curious, so I looked through the linked docs for regex. They have the exact same pattern for checking hex in there as they do in this one. I guess it was just an error after all?
Does anybody know of a good place to get seasonality information for fruits and veggies?
I found an open source application[1] that displays that kind of information, and I've been looking around for data so I can add support for the North America region. [2]
The FDA database is good and this one is too, but neither offers any seasonality information for fruits or veggies :/
I work in an organization that has bought support for Java (11, I think). Your comment is technically correct, but I believe it doesn't capture the whole situation.
We use OpenJDK 11 currently. The reason we're buying support is that bug fixes and backports for JDK 11 END once JDK 12 is released (which has already happened). If you want security/bug fixes, you need to upgrade your application to the new JDK every 6 months. For a team like ours (~8 people), it's simply not viable to be in a state of continual upgrading; we wouldn't get anything else accomplished. Our job is to provide value to the business, not to be stuck in perpetual runtime upgrades. Buying the license gives us time between JDK upgrades so that we can target JDK 11 for new applications.
For quite some time after JDK 11 was released, there was also a prevalent opinion among some of the more senior members of the team that Oracle was going to gimp the OpenJDK release in some way or another. Even after being shown that that wasn't the case, we had _multiple_ meetings where it was argued that we shouldn't migrate to OpenJDK because of bugs that "existed in there, that don't in Oracle build". This is most certainly due to historical reasons, and I don't blame them for thinking it.
I can't speak to why we don't rely on other LTS options; I wasn't part of the meetings where those decisions were made. Ideally we'd rely on something like AdoptOpenJDK, but the powers that be decided to pay Oracle instead.
> If you want security/bug fixes, you need to upgrade your application to the new JDK every 6 months. For a team like ours (~8 people), it's simply not viable to be in a state of continual upgrading.
You need to upgrade to a new JDK every couple of months if you want to stay secure, even if you stay on LTS. You're simply assuming that upgrading to patch releases is easier than upgrading to feature releases. I don't believe this is true, certainly not over a long duration.
The new feature releases are time-based and were originally meant to be called 18.3, 18.9, 19.3 etc. Don't let the fact that people eventually preferred the Chrome-like version naming scheme confuse you into thinking the new feature releases are major ones; they are not. In fact, the probability that your application will break on a patch release and on a feature release is not all that different.
We want the feature release process to be the gradual and cheapest option overall. The LTS path is meant for those who, for some reason, need a less gradual process than either the feature releases or the old model.
Of course, as an Oracle employee, I value your business, and I hope you choose to buy our support (which includes more than patches) even if you switch to the gradual update process.
If I had any say in the matter, I'd also choose to keep up with the upgrades. Unfortunately, I'm not in a position to make decisions for the company, or really even for the team.
Our culture leads to more problems, imo. Tests aren't required. We have > 100 legacy applications (mostly C#, but some Java). I'm currently porting a webapp developed 5 years ago, written on JDK 6 (I think) using Hibernate 4.x.y, to Java 11. There are no tests for the webapp. The majority of the code was written by contractors. I removed hundreds (if not 1-2 thousand) of lines of commented-out or dead code before I even started refactoring. In this case, which I agree is probably an outlier, updating takes too long. Or maybe I'm too inexperienced.
My managers see how long the migration to JDK 11 is taking, and they don't want that for every new "major release" (12, 13, etc.)
> My managers see how long the migration to JDK 11 is taking, and they don't want that for every new "major release"
How long to migrate, and from what version? There was one huge major version between 8 and 11 -- 9, the last one ever. So if you go from 8 to 11, you're basically doing one major release plus two feature ones.
> Tests aren't required.
That's unfortunate, because even patch releases can break your application (and possibly at a probability that's not significantly lower than that for a feature release, although they are both low). In fact, this happened in 11.0.2/8u201[1], which was possibly more disruptive than any of the recent feature releases. Both feature and patch releases require the same amount of testing on update. If someone thinks they don't, then they are very deeply misguided.
Getting support is good, but having an understanding of the platform is also good. Having good tests is probably better than both :)
So I have been in Java versioning hell before, and I empathize--but that was, like, Java 5 to 6. My experiences with Java since then have been almost completely seamless. So, this may be a silly question, but I'd be curious: what is so disruptive to your workflow that upgrading a JDK, running your test suite under instrumentation to smoke out incompatibilities and noticeable perf/memory/etc. regressions, and deploying it would cause you to not "get anything else accomplished"?
I can't speak for the OP, but from what I've seen in other organizations a reluctance to upgrade JDK versions usually indicates a lack of confidence in their test automation suite. If they can't be confident that their tests will catch regression defects then a JDK upgrade seems risky and requires planning ahead for a major manual testing effort. This is just one area where getting to 100% automated functional testing delivers huge benefits.
Pretty much right on the nose. As I mentioned elsewhere, testing is not required where I work. As such, there's no guarantee that a JDK upgrade won't break something behind the scenes now, only to blow up later in production.
Most test suites cannot test long-term stability. We once had a commercial J2EE app that would run fine for weeks on JDK 7.x but commit continual-GC suicide or burn 100% of all cores after about 24 hours on JDK 7.x+1.
After this happened with multiple Oracle “bug fix” releases, we of course stopped upgrading as much as possible, spending days trying to determine whether each security hole being patched in the JDK was actually an attack vector for this app.
We also implemented periodic “therapeutic restarts” on all app servers.
It was like running a critical server app on Windows 95.
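For what it's worth, that failure mode is detectable by automated testing, it just has to run for days rather than minutes. A rough sketch (my own, using the standard GarbageCollectorMXBean API) of the kind of soak-test monitor that would have caught it:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    // Rough sketch: sample cumulative GC time once a minute. In a healthy
    // app the GC share of wall-clock time stays flat; in the "continual-GC
    // suicide" mode it climbs toward 100% as the hours go by.
    public class GcSoakMonitor {
        public static void main(String[] args) throws InterruptedException {
            final long intervalMs = 60_000;
            long lastGcMs = 0;
            while (true) {
                Thread.sleep(intervalMs);
                long gcMs = 0;
                for (GarbageCollectorMXBean gc
                        : ManagementFactory.getGarbageCollectorMXBeans()) {
                    gcMs += gc.getCollectionTime(); // cumulative ms spent in GC
                }
                double share = (double) (gcMs - lastGcMs) / intervalMs;
                System.out.printf("GC share of last minute: %.1f%%%n", share * 100);
                lastGcMs = gcMs;
            }
        }
    }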
Major Java versions have been known to ship different default settings/properties from previous versions, without any mention whatsoever in the release notes.
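To make that concrete (my own sketch, not something from the thread): one cheap defence is a tiny program that dumps the defaults your app depends on under each JDK, so you can diff the output across versions; `java -XX:+PrintFlagsFinal -version` does the same for VM-level flags. The default `file.encoding`, for example, is a property that really did change (to UTF-8, in JDK 18 via JEP 400).

    // My own sketch: run this under the old and the new JDK and diff the
    // output. The property list is illustrative; pick whatever your
    // application actually depends on.
    public class JvmDefaults {
        public static void main(String[] args) {
            String[] props = {
                "java.version", "java.vm.name", "file.encoding", "user.timezone"
            };
            for (String name : props) {
                System.out.println(name + "=" + System.getProperty(name));
            }
        }
    }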
Any 'minor' change in a user application, fonts being a common example, can have a significant negative impact on an enterprise that has built workflows around that application.
That assumes we have (any) tests or benchmarks to refer to. Plus, we have a decent number of legacy Java apps (Java 6-7), and we'd be going straight to 11. On the project I'm on now, it's not a smooth transition.
Care to post what stack you’re on? We’re on Gradle and Spring Boot, and JDK upgrades have been zero-change for us so far (other than bumping the version number, of course). The backwards compatibility has been superb -- what breaks for you?
Not the OP, and our stacks/reasons are probably different, but for perspective: I work on a large Canadian Federal Government project, and a minor change in Java version on the servers for our legacy application is probably... 1-2 months elapsed time, minimum. And that's if we get the change approved.
Mostly it's the slow and manual change management practices, slow and manual approval framework, slow and manual testing approach, slow and manual implementation procedures, and other things that are slow and manual.
While git's documentation might not lay out the model at the start, it certainly shows how everything works, complete with diagrams and such. As such...
> Then it will be the user's fault if they can't be bothered to read it.
could technically be applied to you regarding git. I think that's just me being a "little" pedantic, though.
It sounds like the issue you actually have is that the documentation isn't easily readable in one or two sittings, and you don't have the time (or can't be bothered) to go through it and learn it. Which I totally understand: everyone has different things they need to spend time on, and most of the time learning Git isn't one of them.
I have read large parts of the git docs. I don't like them. It's not just the text: the low-contrast colours and hard-to-read fonts are also factors.
> It sounds like the issue you actually have is that the documentation isn't easily readable in one or two sittings, and you don't have the time (or can't be bothered) to go through it and learn it.
So how long should it take to learn a VCS? And how long does git take to learn?
As someone who also works at a bank (though not a large national one), I get the same feeling, though I understand it to an extent. Not complying with bank standards means you get dinged if you get audited. Getting dinged means you'll most likely get your budget cut by some amount next quarter. Or get the group "reorganized".
I made an account just to comment on this link as I have personal experience in the matter.
I graduated from a relatively good school with a degree in Computer Science, yet wasn't admitted into the major until 2 weeks before graduation. I spent my entire college career in Computer Engineering while taking Computer Science courses, and (thankfully) was able to complete the curriculum without actually being a part of the major. I lucked out because, at my school, Computer Science isn't locked down like many of the other engineering majors, so anyone can take a CS course as long as they meet the pre-reqs. It would be an understatement to say the experience was harrowing.
That's so strange. If there's room in the classes (as appears to be the case), why limit the number of majors?
My school (UVA, in the late '90s) had recently converted Computer Science to a limited-enrollment major to get around this problem. Declared CS majors had preference for CS courses. It wasn't quite so bad that others were completely locked out, but if you weren't declared, you had to be quick to register and might not always get into fun electives (the core courses were usually larger, so it was easier to find a seat).
Since that time, they've actually added a second CS major: they now offer the original BS in CS through the engineering school, and a BA in CS through the college of arts and sciences. The two primary differences are a foreign language requirement for the BA students, and a heavier emphasis on math and general engineering in the BS program.