
Blue Origin has not sent a rocket to Mars in the sense that SpaceX wishes to send Starships to Mars. They have sent a probe. SpaceX has launched probes to celestial bodies far more distant than Mars.


Starship will never go to Mars. It's very unlikely it will go to the Moon.


I have said this for years. Starship will eventually go to orbit; it MIGHT go to the Moon a few times. It will be lucky if it ever makes it to Mars.

More than happy to be proven wrong. I mean they are still progressing but it is just a case of figuring out how long their runway is (economics).


Anyone who is paying attention knows that Starship is mostly going to be a launch vehicle for Starlink. It's very unlikely that the upper stage will ever support external payloads.


Why wouldn't they make it available for external payloads if they get the cost per kg lower than F9? Running Starship only is going to be cheaper than running both rockets, unless the economics of Starship are worse (in which case it would not be used for Starlink either).


Can you provide your logic for this conclusion?


C'mon. Don't kill my dream. I dream of Elon Musk flying to Mars. And staying there.


Oops. Earth's space connection to X just went down. We expect service to resume in about one martian lifetime.


He hasn't even been to orbit.


I would be sooooooo sad if we get a Challenger #2 while sending Musk up. Depends on if he's the only one on board.


... but alone. We don't want some Expanse-like scenario down the line with a fascist part of mankind completely unhinged. Once he is over there, then colonize all you want.


Have you bet on that on some betting market? I'd like to take that bet.


I have not, but I just checked and the odds for an HLS Moon landing before 2028 are at 12%.

https://kalshi.com/markets/kxmoon/nasa-lands-on-the-moon/moo...


12% odds over 3 years seems fairly reasonable for a manned landing.

Your statement that "Starship will never go to Mars. It's very unlikely it will go to the Moon", which sounds like it includes even unmanned test landings, is quite a different beast.


There are more possible bets on Manifold; you do you.

I'm not really a betting man, but given that the HLS budget is spent and most of the technology is not nearly developed, I'd say even an unmanned Moon landing is at least 5 years and $10 billion away, and Mars is pure fantasy.


> but given the HLS budget is spent

What does that mean? Starship is basically self-funded by SpaceX, and the amount of money they got for the HLS contract is something they had blown way past even before the contract was awarded, so that claim doesn't make much sense.


They have copied the external look of the rockets (which is, for obvious reasons, not protected by ITAR), but to my knowledge no one, China or otherwise, has managed to match the internal engineering prowess nearly a decade later.

To say that China is "very effective at copying all the stuff SpaceX is doing" is quite the stretch.


Blue Origin has finally turned it around and is getting close to matching SpaceX.


I sure hope so, but I also caution (admittedly as an outsider; your posts suggest greater familiarity than I have) that there's not enough evidence yet to be confident of that: finally getting to launch is really impressive, but their first orbital-capable rocket is still a sample size of 1.

(The counterpoint is that New Glenn's one orbital launch is more impressive than all of the Starship launches combined, as SpaceX is willing to burn hardware during tests and they have many more Starship tests to go).


I know people at SpaceX, but I don't know anyone at Blue Origin. We're all very curious to know what has changed, as they very clearly have made changes. There was a large shakeup in company management there in late 2023, and the turnaround on timeline was stunning. They are certainly moving very, very fast now.


These are quite often used in defense and space applications, where the flexibility of the DSP allows for custom waveform implementations that would otherwise require incredible CPU processing power or small-batch ASICs. The Versal fabric will only expand the potential use cases even further in these domains. Cost is often lower on the priority list for these as well.


Yes, I have seen Versal, in particular in defense and satcom. However, in that same field, I have never seen an RFSoC.

I've seen lots of integrated RF transceivers that were tightly coupled to the FPGAs, but not shared on the same SoC.


Is that because defense doesn't like them or is it because (non-wartime) defense moves on geological timescales and these are "new"?


As a compact-ish explanation: a "standard" wideband RF system in an EW or RF reconnaissance platform covers 0-18 GHz (DC up through the Ku band), or at least as much of it as possible (with Ka/mmW becoming common on new systems), and it has challenging requirements compared to a communication system. Communication systems are simpler to design since both sides of the link cooperate and filter out a wide swath of potentially interfering signals, but a military system wants to see as many signals as possible so they can be collected or jammed.

Given those requirements, it has not been advantageous to use an integrated RFSoC in the past. If a company were spending millions of dollars designing a complicated front end, they might as well pick a separate ADC/DAC that maximizes the performance they cared about, rather than go with the "easy" integrated RFSoC option that might not have the absolute best performance. Now the industry is just getting to the point where a system like a direct-sampling ADC/DAC integrated into a Versal might be able to process massive bandwidths at high enough bit rates to do useful things for military applications; it may actually be worth it now because you can push to really high data rates, and the additional processing might make up for a small loss in ADC/DAC performance. Give it a couple of years for these to make it into new designs and get fielded.

So I guess the tl;dr is that it is not because defense doesn't like integrated packages; they just haven't been worth it considering the design goals. Defense does move slowly, but this is more about being able to field "military-grade" solutions that work well in challenging RF environments, and once that is possible the government will start to pay for it.
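
To put rough numbers on why direct-sampling the whole band was out of reach for integrated parts until now, here is a quick Python sketch. Every figure below is my own back-of-the-envelope assumption, not a spec from any datasheet:

  # Back-of-the-envelope only; sample rate, resolution and channel count
  # are assumed illustrative values, not any specific part's specs.
  f_max_hz        = 18e9          # top of the band we want to see directly
  nyquist_sps     = 2 * f_max_hz  # minimum sample rate: 36 GSPS per channel
  bits_per_sample = 12            # assumed converter resolution
  channels        = 8             # assumed channel count

  raw_rate_bps = nyquist_sps * bits_per_sample * channels
  print(f"Nyquist rate : {nyquist_sps / 1e9:.0f} GSPS per channel")
  print(f"Raw data rate: {raw_rate_bps / 1e12:.2f} Tb/s aggregate")  # ~3.46 Tb/s

Raw rates on that order are why having decimation/channelization sitting right next to the converters in the fabric, rather than peak converter specs alone, is what finally makes an integrated option attractive.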


Which comparably-priced ADC/DAC ICs are pushing 6 GSPS on 8x8 channels like the $500 (actual price, not fuck-you DigiKey price) RFSoCs?


Yeah, we got a dev board for one of the higher-end RFSoC chips for a project we were doing in high-frequency trading (on the networking side, over microwave), and we had to jump through a bunch of hoops for approval since they are mostly intended for defence.

Lots of phased-array radar and electronic warfare applications.


Anduril Intelligence and Space (https://www.anduril.com/) | Reston, VA (onsite, relocation available) | Full Time

Anduril Industries is a defense technology company with a mission to transform U.S. and allied military capabilities with advanced technology.

Anduril Intelligence and Space (AIS) is focused on positioning Anduril as a lead provider of specialized engineering and products for Intelligence Community (IC) customers.

In particular we are looking for engineers with experience in:

- Embedded Software Development (bonus points with Rust or Haskell experience!): https://job-boards.greenhouse.io/andurilindustries/jobs/4597...

- FPGA Design: https://job-boards.greenhouse.io/andurilindustries/jobs/4591...

- Hardware Design: https://job-boards.greenhouse.io/andurilindustries/jobs/4591...

Apply here: https://www.anduril.com/open-roles/

Feel free to reply or email me with any questions!


There is hope for us lowly humans!


It is difficult to prescribe an answer to this question that works for everyone, but the best I personally can think of is "practice".

To make that more actionable... My approach in life has generally been to find a project (even something seemingly incredibly dumb, as long as it is fun), then work through it, learning what I need to know as I go along. To learn "well", you must then also constantly question what you have done as you complete various stages of the project to see if you have done them as effectively as possible, and try to incorporate any lessons learned into future projects.

I have found that how individuals do the learning required for this differs significantly from person to person, so it is hard to recommend any particular approach.


There are a few factors that make this possible:

1. As others have pointed out, the link budget (how much energy loss a particular radio link can handle before it is broken) for D2C satellites assumes a nearly direct line of sight from your handset to the satellite. This is much easier to achieve with satellites in space than it is with traditional cell towers that might have numerous walls/buildings in the way.

2. The D2C satellites use massive phased array antennas that are able to point a very narrow beam very accurately to the ground. This provides a substantial amount of antenna gain that further helps the link budget. The gain from the antennas allows the satellites to pick up even relatively weak signals from a handset.

There are other tricks as well, but these account for the largest differences. Of course, Doppler gets in the way, but it is a solvable problem.
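
To make the link-budget point a bit more concrete, here is a rough uplink sketch in Python. Every number in it (altitude, frequency, antenna gain, channel bandwidth, noise figure) is an assumption I picked for illustration, not an actual Starlink/D2C figure:

  import math

  def fspl_db(dist_m, freq_hz):
      # Free-space path loss in dB
      c = 3e8
      return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

  alt_m        = 550e3   # assumed satellite altitude (low Earth orbit)
  freq_hz      = 1.9e9   # assumed PCS-band uplink frequency
  handset_eirp = 23.0    # dBm, typical handset transmit power
  sat_gain_dbi = 38.0    # assumed gain of a large phased-array receive antenna
  chan_bw_hz   = 1.4e6   # assumed narrow LTE-style channel
  noise_fig_db = 3.0     # assumed receiver noise figure

  rx_dbm    = handset_eirp - fspl_db(alt_m, freq_hz) + sat_gain_dbi
  noise_dbm = -174 + 10 * math.log10(chan_bw_hz) + noise_fig_db

  print(f"Path loss: {fspl_db(alt_m, freq_hz):.1f} dB")   # ~153 dB
  print(f"Rx power : {rx_dbm:.1f} dBm")
  print(f"Noise    : {noise_dbm:.1f} dBm")
  print(f"Link SNR : {rx_dbm - noise_dbm:.1f} dB")        # positive => link closes

The point is just that a few tens of dB of phased-array gain on the satellite side turns an otherwise hopeless ~153 dB path loss into a link that closes at handset power levels.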


So the satellite can point a narrow beam. How does it handle multiple connections? Can it aim 1000s of beams at once?


In theory, yes. Phased arrays can steer as many independent beams as the connected electronics support. In real life, it's probably going to be dozens or maybe hundreds of beams.
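
For intuition on the "many beams at once" part, here is a minimal numpy sketch of digital multi-beam forming; the array size, spacing, and beam angles are arbitrary illustrative choices:

  import numpy as np

  n_elem   = 64                        # elements in a 1-D array (illustrative)
  d        = 0.5                       # element spacing in wavelengths
  angles   = np.deg2rad([-20, 5, 30])  # three simultaneous beam directions
  k        = 2 * np.pi                 # wavenumber, in units of 1/wavelength
  elem_pos = np.arange(n_elem) * d

  # One steering-weight vector per beam; each beam is just another weighted
  # sum of the same element samples.
  weights = np.exp(-1j * k * np.outer(np.sin(angles), elem_pos))   # (beams, elems)

  # Evaluate each beam's array factor across all look angles.
  scan  = np.deg2rad(np.linspace(-90, 90, 721))
  steer = np.exp(1j * k * np.outer(np.sin(scan), elem_pos))        # (angles, elems)
  af    = np.abs(steer @ weights.conj().T) / n_elem                # (angles, beams)

  for a, col in zip(np.rad2deg(angles), af.T):
      print(f"beam steered to {a:+.0f} deg peaks near {np.rad2deg(scan[col.argmax()]):+.1f} deg")

Each additional beam is one more weighted sum over the same element samples, so the practical limit is the digital back end (and the downlink capacity behind it), not the antenna hardware.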


I guess the narrow beam covers quite a bit of area on Earth.

I guess the "narrow" in the current context is the beam widening to hundreds of miles on Earth.


IIRC the current Starlink beams are on the order of 10 miles on the ground. So much narrower than you guess.
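
For anyone wondering where a number of that order comes from, a crude diffraction-limit estimate lands in the same ballpark. The aperture size, frequency, and altitude below are assumptions on my part, not actual Starlink antenna parameters:

  # Half-power beamwidth is roughly wavelength / aperture; all values assumed.
  c       = 3e8
  alt_m   = 550e3    # assumed orbital altitude
  freq_hz = 12e9     # assumed Ku-band downlink
  apert_m = 1.0      # assumed phased-array aperture size

  wavelength    = c / freq_hz
  beamwidth_rad = wavelength / apert_m        # ~0.025 rad
  spot_km       = alt_m * beamwidth_rad / 1e3
  print(f"footprint ~ {spot_km:.0f} km")      # ~14 km, i.e. of order 10 miles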


Anduril Intelligence and Space (https://www.anduril.com/) | Reston, VA (onsite, relocation available) | Full Time

Anduril Industries is a defense technology company with a mission to transform U.S. and allied military capabilities with advanced technology.

Anduril Intelligence and Space (AIS) is focused on positioning Anduril as a lead provider of specialized engineering and products for Intelligence Community (IC) customers.

In particular we are looking for engineers with experience in:

- Embedded Software Development (bonus points with Rust or Haskell experience!): https://job-boards.greenhouse.io/andurilindustries/jobs/4415...

- FPGA Design: https://job-boards.greenhouse.io/andurilindustries/jobs/4367...

- Hardware Design: https://job-boards.greenhouse.io/andurilindustries/jobs/4367...

Apply here: https://www.anduril.com/open-roles/

Feel free to reply or email me with any questions!


Curious to know what app you are using to get boundaries and layers. OnX?


GaiaGPS: https://www.gaiagps.com/

Like with most apps, the layers are only available under paid plans, but I'm grandfathered in on an old plan.


I think “low speed” is quite a relative term here. PCIe SerDes lanes are for very high data rate communication (>1 Gbps per lane). This is the realm of the Syzygy XCVR standard.

The lower-speed Syzygy standard, while not operating at those rates, is capable of much higher throughput than a typical microcontroller. There are many peripherals with I/O requirements beyond a simple LED or SPI device, but below those of PCIe or other high-rate transceivers, such as:

- moderate to high end ADCs and DACs (LVDS and parallel)

- image sensors (LVDS, parallel, and HiSPI)

- various networking PHYs

The lower-end Syzygy connector has pinouts to support LVDS differential pairs, which can easily achieve data rates of hundreds of Mbps per pair.
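
As a ballpark comparison (the pair count and per-pair rate here are assumed, illustrative values, not numbers taken from the Syzygy spec):

  # Rough aggregate-bandwidth comparison with assumed figures.
  lvds_pairs    = 8       # assumed number of differential pairs in use
  lvds_rate_bps = 500e6   # assumed per-pair LVDS data rate
  spi_clock_hz  = 50e6    # a fast-ish MCU SPI port, single data line

  lvds_total = lvds_pairs * lvds_rate_bps     # 4 Gb/s aggregate
  print(f"LVDS aggregate: {lvds_total / 1e9:.1f} Gb/s vs SPI: {spi_clock_hz / 1e6:.0f} Mb/s")

That gap is the niche the lower-speed Syzygy pods sit in.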


Piggybacking on this: image sensors using CSI are quite common. I don't know if anyone has this application, but theoretically if you wanted more than a few video streams (I think even higher-end processors cap out at 4), in come... FPGAs. Maybe the newer Qualcomm XR chipsets can deal with that, but an FPGA seems more attainable.
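
To gesture at why the aggregate bandwidth gets out of hand quickly with multiple sensors (the sensor count and video format below are assumptions purely for illustration):

  # Rough aggregate MIPI CSI-2 bandwidth for several cameras; assumed values.
  sensors       = 8
  width, height = 1920, 1080
  fps           = 60
  bits_px       = 10          # RAW10

  per_sensor_bps = width * height * fps * bits_px   # ~1.24 Gb/s
  total_bps      = sensors * per_sensor_bps         # ~10 Gb/s before protocol overhead
  print(f"per sensor: {per_sensor_bps / 1e9:.2f} Gb/s, total: {total_bps / 1e9:.1f} Gb/s")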


Qualcomm has supported CSI virtual channels for ages - you can get a little 4:1 bridge IC to renumber streams. IIRC, the 855 had 4 CSI IFEs, and would happily handle a 16x configuration if you bought the wide memory bus.

