The thing is, they may well have one amazing breakthrough - but at least three are needed to make a workable quantum computer (long coherence times, precise control/measurement, and error correction).
Such nonsensical hype: they didn't even bother to offer any case for how they surmounted the 40 different technical revolutions needed to achieve actual error correction.
I will have to disagree with you on the merits of the company (keeping in mind that this is just a press release, not a journal paper or pop-sci article). I work in this field, and personally, I am more excited about PsiQuantum than about Google's or IBM's transmon devices. If you would like the technical details, I would suggest checking their APS March Meeting conference talk (it is recorded and available online). They plan to use the "cluster state" model of quantum computation, which has significant advantages when scaling up (even if the initial prototype is difficult to produce). Check out "Why I am optimistic about the silicon-photonic route to quantum computing" by Terry Rudolph (2016) for more details.
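To make the "cluster state" (measurement-based) model a bit more concrete, here is a toy numpy sketch of the basic primitive: entangle qubits with controlled-Z gates up front, then drive the computation purely by single-qubit measurements. The two-qubit example and conventions below are my own illustration, not PsiQuantum's actual design:

```python
import numpy as np

# Build the smallest cluster state: |+>|+> entangled by a controlled-Z.
plus = np.array([1.0, 1.0]) / np.sqrt(2)       # |+> = (|0> + |1>)/sqrt(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])            # controlled-Z entangler
cluster = CZ @ np.kron(plus, plus)             # 2-qubit linear cluster state

# "Computing" now means measuring. Measure qubit 1 in the X basis and
# keep the |+> outcome: this teleports qubit 1's state onto qubit 2,
# up to a Hadamard (plus a Pauli correction for the other outcome).
amps = cluster.reshape(2, 2)                   # amps[i, j] = <i j | cluster>
qubit2 = plus @ amps                           # project qubit 1 onto |+>
qubit2 = qubit2 / np.linalg.norm(qubit2)

print(qubit2)  # qubit 2 ends up in H|+> = |0>, i.e. [1, 0]
```

The scaling argument, roughly, is that large entangled resource states plus adaptive single-qubit measurements can be easier to engineer photonically than long sequences of deterministic two-qubit gates.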
Personally, I feel PsiQuantum is the only company that does not talk about nonsensical near-term quantum applications and quantum machine learning and quantum optimization and other questionable applications of quantum computing. They are pushing hard for the creation of actual quantum computers.
Edit: I am probably a bit biased, as a lot of my work does indeed involve photonic hardware.
How does their approach compare with Xanadu's? They recently outlined a blueprint for photonic QC in Nature[1], but I've seen quite a bit of skepticism toward the company in online circles.
They are both photonic modalities of quantum computing that share some similar engineering, but they rest on deeply different principles.
In the PsiQuantum hardware (cluster state computing) the non-classical effects come from using single photons, not pseudo-classical (a.k.a. coherent) states. It is very difficult to reliably produce single photons of exactly the same wavelength, and difficult to detect them with extremely high efficiency, but the rest of the engineering is "easy".
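A quick way to see why single photons are the non-classical resource here: a single-photon (Fock) state has a definite photon number, while a coherent state only has a Poissonian photon-number distribution. A toy numpy illustration in a truncated Fock basis (the truncation and alpha are arbitrary choices of mine):

```python
import numpy as np
from math import factorial

dim = 20                                        # Fock-space truncation
alpha = 1.0                                     # coherent-state amplitude

# Coherent state |alpha>: c_n = exp(-|alpha|^2 / 2) * alpha^n / sqrt(n!)
coherent = np.array([np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(factorial(n))
                     for n in range(dim)])
single_photon = np.zeros(dim)
single_photon[1] = 1.0                          # exactly one photon, always

p_coh = np.abs(coherent) ** 2                   # Poissonian photon statistics
p_fock = np.abs(single_photon) ** 2             # deterministic photon number

n = np.arange(dim)
mean = n @ p_coh
var = (n ** 2) @ p_coh - mean ** 2
print(p_fock[1], round(mean, 3), round(var, 3))  # 1.0, with mean ~ variance ~ 1
```

Counting exactly one photon every time is precisely what the hard-to-build sources and detectors have to deliver.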
In the Xanadu hardware (continuous variable computing) the non-classical effects come from "non-Gaussian" operations (operations that cannot be built from the "Gaussian" beamsplitters, wave plates, and squeezers). They do not need single-photon states of light; they can start with coherent states. Conveniently, the "Gaussian" operations and coherent states are the easy ones, but obtaining "non-Gaussian" resources is just as difficult (difficult in a different way) as obtaining consistent single-photon states.
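To make the "Gaussian operations are the easy ones" point concrete: a Gaussian state is fully described by a mean vector and a covariance matrix, and Gaussian operations act on the covariance matrix as small symplectic matrices, which is why they are classically cheap to track. A single-mode squeezer sketch (the quadrature conventions, hbar = 1 units, and squeezing parameter are my choices):

```python
import numpy as np

# Vacuum covariance matrix in (x, p) quadrature ordering, hbar = 1:
V = 0.5 * np.eye(2)

# A single-mode squeezer is the symplectic matrix diag(e^-r, e^r):
r = 0.5
S = np.diag([np.exp(-r), np.exp(r)])
V_sq = S @ V @ S.T                 # Gaussian in, Gaussian out

print(np.diag(V_sq))               # x-variance shrinks, p-variance grows
print(np.linalg.det(V_sq))         # det unchanged: still a pure, minimum-uncertainty state
```

Everything built from such matrices (beamsplitters, phase shifts, squeezing) acting on coherent inputs stays efficiently simulable classically; that is exactly why some non-Gaussian element is needed before the machine can outperform a classical computer.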
In the end, both of these low-level hardware approaches will be abstracted behind a nice programmable interface that just gives you an abstract quantum computer (e.g. based on the most well-known "gate model"), but the low-level implementations are as different as it gets.
Lastly, there are teams that try to directly implement the gate model (IBM, Google, Yale, UCSB, etc.), teams that try fancy topologically protected models (Microsoft, paper retractions notwithstanding), and teams pursuing adiabatic quantum computing (D-Wave, though they were snake-oil salesmen for a while and lost a lot of goodwill in the academic community).
And there is a ton of cross-pollination between the different teams and different (but computationally equivalent) models.