Hacker News | new | past | comments | ask | show | jobs | submit | yoan9224's comments

Cool concept! I built something similar (real-time visitor locations on a globe) and have a few thoughts:

1. How are you handling connection scaling? Realtime presence with many concurrent users gets expensive fast. We cap concurrent WebSocket connections and batch updates every few seconds rather than streaming every event.

2. The pin clustering looks good - are you using a quadtree or just distance-based clustering? At higher zoom levels the overlap can get chaotic.

3. One thing that worked well for us: showing a brief "trail" or fade-out when someone leaves, rather than just disappearing. Makes it feel more alive.
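Roughly, the batching in point 1 looks like this for us (a simplified Python sketch with made-up names, not our production code - the real thing sits behind the WebSocket layer):

```python
import time

class PresenceBatcher:
    """Collects per-visitor position updates and flushes them as one
    message every `interval` seconds, instead of broadcasting every
    event as it arrives."""

    def __init__(self, broadcast, interval=5.0, now=time.monotonic):
        self.broadcast = broadcast   # callable that takes a dict of updates
        self.interval = interval
        self.now = now               # injectable clock, handy for testing
        self.pending = {}            # visitor_id -> latest (lat, lon)
        self.last_flush = now()

    def update(self, visitor_id, lat, lon):
        # Later updates for the same visitor overwrite earlier ones,
        # so a burst of moves costs one entry, not N messages.
        self.pending[visitor_id] = (lat, lon)
        if self.now() - self.last_flush >= self.interval:
            self.flush()

    def flush(self):
        if self.pending:
            self.broadcast(dict(self.pending))
            self.pending.clear()
        self.last_flush = self.now()
```

The overwrite-then-flush shape is why costs drop so much: you pay per flush, not per event.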

What's your stack? The latency feels snappy.


The comment about maintainers not getting paid resonates. I'm a solo founder and these security changes, while necessary, add real friction to shipping.

The irony is that trusted publishing pushes everyone toward GitHub Actions, which centralizes risk. If GHA gets compromised, the blast radius is enormous. Meanwhile solo devs who publish from their local machine with 2FA are arguably more secure (smaller attack surface, human in the loop) but are being pushed toward automation.

What I'd like to see: a middle ground where trusted publishing works but requires a 2FA confirmation before the publish actually goes through. Keep the automation, keep the human gate. Best of both worlds.

The staged publishing approach mentioned in the article is a good step - at least you'd catch malicious code before it hits everyone's node_modules.


The time-based zoom interaction is clever - slowing down time as you zoom in makes the data feel more tangible. I've been working on a similar real-time globe visualization and learned a few things the hard way:

1. Throttling updates is critical. We went from per-event updates to 5-10 second batches and cut our WebSocket costs by 90%+ while the UX barely changed.

2. For the "ships crossing land" artifacts people are noticing - interpolating between sparse data points on a Mercator projection will always create these. On a globe (orthographic), great circle interpolation looks correct, but on flat maps you need to detect ocean crossings and handle them specially.

3. The biggest perf win was hybrid rendering: static heatmap for historical data + WebGL particles only for "live" movement. Trying to animate everything kills mobile.
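For the curious, the great circle interpolation in point 2 is just slerp on unit vectors. A rough self-contained sketch (not the submission's code):

```python
import math

def slerp(p0, p1, t):
    """Interpolate a fraction t along the great circle between two
    (lat, lon) points given in degrees. Linear interpolation in raw
    lat/lon space is what produces ships cutting across land."""
    def to_vec(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))

    a, b = to_vec(*p0), to_vec(*p1)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    omega = math.acos(dot)          # central angle between the points
    if omega < 1e-9:                # coincident points: nothing to do
        return p0
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    x, y, z = (s0 * ax + s1 * bx for ax, bx in zip(a, b))
    return (math.degrees(math.asin(z)),
            math.degrees(math.atan2(y, x)))
```

On an orthographic globe the resulting path looks right by construction; on Mercator you still have to split the path at the antimeridian before drawing it.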

Would love to see this with more recent data. The 2012 snapshot is fascinating but comparing pre/post-Suez blockage or COVID disruptions would be incredible.


been using a GL.iNet AXT1800 for travel and it's been amazing. mainly for hotel wifi, where you can log in once and all your devices connect automatically. curious how this compares - the UniFi ecosystem integration could be nice, but GL.iNet is way more hackable


the timing of this with the Epstein docs is pretty funny. honestly feels like someone did those redactions badly on purpose - anyone who works with PDFs knows you don't just draw black boxes over the text. either massive incompetence or malicious compliance


bellard is basically proof that the "10x engineer" exists. ffmpeg, qemu, quickjs, and now this... all from one person. the fact that he can optimize a js engine down to 10kb of ram is wild. would love to see this on esp32 boards


What use cases do you think it would help with on the ESP32?


this is genuinely sad. groq had really fast inference and was a legit alternative architecture to nvidia's dominance. feels like we're watching consolidation kill innovation in real time. really hoping regulators actually look at this one, but not holding my breath


Like the website! It'd be nice to download directly from the website instead of being redirected to the GitHub. Otherwise the product works great!


Thanks!

I agree - that's next on my list. I wanted to get it out there early. Thanks for checking it out!


Hello KomoD, thank you for the reminder. I'll submit properly next time!


Hey! Maker here.

Context: I'm a solo dev who got frustrated with enterprise analytics tools. I was paying Mixpanel + GA4 subscriptions while my side project generated zero revenue. Classic indie hacker mistake.

The final straw was GA4's interface redesign - it took 2 hours to build a signup funnel. The 45kb script was killing my Lighthouse scores, and I was losing 15% of EU signups to cookie banners.

I wanted to build something that:

1. Looks beautiful (the 3D globe is surprisingly motivating to watch)

2. Answers questions in plain English instead of pivot tables

3. Doesn't cost more than my hosting ($9/month for both seemed fair)

4. Respects privacy without being preachy about it

Technical challenges:

- Tracking script size: ended up using event batching + compression to hit 3.8kb

- Globe performance: WebGL shader optimization to load in <3s

- AI costs: smart routing between GPT-4o-mini and GPT-4 based on query complexity

- Realtime scalability: connection pooling, because 40 concurrent viewers = $$$ on Supabase
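The model routing is nothing fancy - roughly this kind of heuristic (a simplified sketch with invented names and thresholds, not the actual code):

```python
# Hypothetical cost-aware router: cheap model for simple analytics
# lookups, expensive model for anything that needs reasoning.
CHEAP, EXPENSIVE = "gpt-4o-mini", "gpt-4"

# Keywords that usually signal an analytical question rather than
# a simple metric lookup (illustrative list, not exhaustive).
COMPLEX_HINTS = ("why", "compare", "trend", "predict", "explain",
                 "correlat", "anomal")

def pick_model(query: str, max_cheap_words: int = 20) -> str:
    q = query.lower()
    if len(q.split()) > max_cheap_words:   # long queries go upmarket
        return EXPENSIVE
    if any(hint in q for hint in COMPLEX_HINTS):
        return EXPENSIVE
    return CHEAP
```

Most queries are "how many visitors today" style lookups, so the cheap path handles the bulk of the volume.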

Happy to answer questions about the architecture, privacy approach, or why I chose Supabase over self-hosting!


Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact
