Hacker News | past | comments | ask | show | jobs | submit | jumperabg's comments

If used, can we hallucinate and predict tomorrow's HN news (especially any acquisition-related news)?

Yes, but you won't care.

So if I work out every day, with some days doing strength training and others long runs, cardio, sprints, or strides, my heart failure risk is higher if I use melatonin pills?

Also, the article mentions that they focused only on groups that had already been prescribed the medication or had diagnosed insomnia or other heart-related issues. So if you are fully healthy and take it from time to time to recover sleep or adjust to a new timezone when travelling, it might be okay for occasional use (I am not a doctor).


and AI snakes


What if his wife is AI? Are we swiping right on Grok 4?


Maybe Gemini CLI achieved AGI?


... is artificial leaves that create liquid fuel for fusion reactors.


Why would you run your site on Workers instead of static content hosting? Aren't Workers supposed to be used when you must do computational work for requests, connect to a DB, do some work, etc.?


I got curious about it too, and found that the author actually did a write-up on that in 2020 [0]. I don't know that much about Workers, but it sounds like it's needed for analytics?

[0]: https://web.archive.org/web/20250328111057/https://endler.de...


Author here. That's the answer.

Or at least it used to be the answer when I still cared about analytics. Nowadays, friends send me a message when they find my stuff on social media, but I stopped caring about karma points long ago. This isn't me humblebragging; I'm just getting older.

The longer answer is that I got curious about Cloudflare Workers when they were announced. I wanted to run some Rust on the edge! Turns out I never got around to doing anything useful with it, and later I was too busy to move the site back to GH Pages. Also, Cloudflare Workers is free for 100k requests, which gave me some headroom. (Although lately I get closer to that ceiling on good, "non-frontpage" days, because of all the extra bot traffic and my RSS feed...)

But of course, the HN crowd just saw that the site was down and assumed incompetence. ;) I bury this comment here in the hope that only the people who care to hear the real story will find it. You're one of them because you did your own research. This already sets you apart from the rest.


Using Workers is now what Cloudflare recommends by default, with "Static Assets" to host all the static content for your website. Pages, as I understand it, is already built on the Workers platform, so it's all just simplifying the DX for Cloudflare's platform and giving more options to choose what rendering strategy you use for your website.
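As a rough sketch of what that looks like in practice (the project name, directory, and compatibility date here are made-up examples, not taken from the author's setup), Workers Static Assets is mostly a wrangler.toml entry:

```toml
name = "my-site"                  # hypothetical project name
compatibility_date = "2024-09-25" # example date; use your own

# Point Workers at the pre-built static output; no Worker script
# is required for an assets-only site.
[assets]
directory = "./public"
```

With only the `[assets]` block, requests are served straight from the static files; adding a `main` script on top is what turns it back into a "real" Worker.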


Sometimes people use them as reverse proxies or to load stuff from cloud KV. You could probably build a web stack on it if you wanted to. But I think a static blog like this would be simple enough on Pages or behind the regular CDN.
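The reverse-proxy case mentioned above is mostly URL rewriting. A minimal sketch of just that step (the hostnames are made up, and this uses only the standard `URL` API so it isn't tied to the Workers runtime):

```typescript
// Map an incoming request URL onto a hypothetical origin server,
// keeping the path and query string intact.
function rewriteToOrigin(requestUrl: string, origin: string): string {
  const incoming = new URL(requestUrl);
  return new URL(incoming.pathname + incoming.search, origin).toString();
}

// Inside a Worker's fetch handler, the proxy body would then be
// roughly: return fetch(rewriteToOrigin(request.url, ORIGIN));
console.log(
  rewriteToOrigin("https://my-site.dev/blog/post?ref=hn", "https://origin.example.com")
);
// → https://origin.example.com/blog/post?ref=hn
```

For a static blog, though, there is nothing to rewrite, which is why Pages or the plain CDN is the simpler fit.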


Amazing. Is this another new type of antibiotic discovered in the last 6 months?


Interesting, could it be that their software is built by Gemini, the acquisition is managed by Gemini, and the Gemini in Google made a $32B deal with the Gemini at Wiz?


This is quite a lot of code to handle in one file. The recommendation is actually good. In the past month (which feels like a year of planning) I've made similar mistakes with tens of projects: with files larger than 500-600 lines of code, Claude was removing some of the code, I didn't have coverage on some of it, and the end result was missing functionality.

Good thing that we can use .cursorrules, so this is something that will partially improve my experience, at least until a random company releases the best AI coding model that runs on a Raspberry Pi with 4 GB of RAM (yes, this is a spoiler from the future).
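For anyone who hasn't used it: .cursorrules is just a plain-text file of instructions the editor feeds to the model. A hypothetical set of rules along the lines of what's being discussed here (the wording is my own, not an official format):

```text
Keep source files under roughly 500 lines; propose a split before editing larger ones.
Never delete existing functions or branches unless explicitly asked to.
When refactoring, list every removed symbol in the change summary.
```

The point is to make the "don't silently drop code" expectation explicit instead of hoping the model infers it.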


> I've made similar mistakes with tens of projects - having files larger than 500-600 lines of code

Is it a mistake though? Some of the best codebases I worked on were a few files with up to a few thousand LoC. Some of the worst were the opposite: thousands of files with less than a few hundred LoC in each of them. With the tools that I use, I often find navigating and reading through a big file much simpler than having to have 20 files open to get the full picture of what I am working on.

At the end of the day, it is a personal choice. But if we have to choose something we find inconvenient just to be able to fit in the context window of an LLM, then I think we are doing things backwards.


Claude seems to be somewhat OK with 1500 LoC in one file. It may miss something or mess something up, sure; that is why you should chunk it up.


I'm using Cursor and Claude/R1 on a file with 5000 LoC; it seems to cope OK.

