Hacker News | plaidfuji's comments

Nicer to ride in, in what sense? I find single-pedal driving nauseating because many drivers can’t control their foot raise well enough for smooth, gentle braking, and the suspension feels chunky as hell, most likely because of how heavy the car is. But maybe that’s just Teslas. Their ride is just categorically worse than most ICE vehicles.

I like this viewpoint - it basically casts VC-backed AI startups as privately-subsidized applied R&D projects, which largely seems to be the case for foundational model companies.


… or, let’s see humans who are now 10-100x more productive (due to automation of mundane tasks that are already part of the training data) do the things you’re asking for.


Likely an analog-to-digital converter (ADC), digitizing the raw signal from the photodetector cells


It is annoying though; when you start a new chat for each topic you tend to have to re-write context a lot. I use Gemini 3, which I understand doesn’t have as good a memory system as OpenAI. Even on single-file programming stuff, after a few rounds of iteration I tend to hit its context limit (on the thinking model), either because the answers degrade or because it just throws the “oops something went wrong” error. Then it’s time to restart from scratch and paste in the latest iteration.

I don’t understand how agentic IDEs handle this either. Or maybe it’s easier - they just resend the entire codebase every time. But where do you cut the chat history? It feels to me like every time you re-prompt a convo, it should first summarize the existing context as bullets for its internal prompt rather than re-sending the whole thing.


Agentic IDEs/extensions usually continue the conversation until the context gets close to 80% full, then compact it. With both Codex and Claude Code you can actually observe that happening.

That said, I find that in practice Codex performance degrades significantly long before it reaches the point of automatic compaction - and AFAIK there's no way to trigger it manually. Claude, on the other hand, has a command to force compaction, but I rarely use it because it's so good at managing context by itself.

As far as multiple conversations, you can tell the model to update AGENTS.md (or CLAUDE.md or whatever is in their context by default) with things it needs to remember.
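For anyone curious what that compaction step looks like under the hood, here's a toy sketch. All the names (`count_tokens`, `summarize`, the 80% threshold) are placeholders based on the behavior described above, not any tool's actual API - real tools make an LLM call where `summarize` is stubbed here.

```python
def count_tokens(messages):
    # crude proxy: roughly 4 characters per token
    return sum(len(m["content"]) for m in messages) // 4

def summarize(messages):
    # in a real agent this is an LLM call; here just a truncating stub
    joined = " ".join(m["content"] for m in messages)
    return {"role": "system", "content": "Summary of earlier turns: " + joined[:200]}

def compact(messages, budget=100_000, threshold=0.8, keep_recent=4):
    """Replace older turns with a summary once context nears the budget."""
    if count_tokens(messages) < budget * threshold:
        return messages  # plenty of room, send history as-is
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    return [summarize(old)] + recent
```

The key design choice is keeping the most recent turns verbatim (the model needs them for local coherence) while lossily compressing everything older.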


Codex has `/compact`


> The “Set up billing” link kicked me out of Google AI Studio and into Google Cloud Console, and my heart sank. Every time I’ve logged into Google Cloud Console or AWS, I’ve wasted hours upon hours reading outdated documentation, gazing in despair at graphs that make no sense, going around in circles from dashboard to dashboard, and feeling a strong desire to attain freedom from this mortal coil.

100% agree


we will have this fixed soon : ) thank you for the patience, have wanted this in AI Studio directly since the day I joined Google!


Reasoning (3 Pro) >> Flash, and I assume their overviews are generated by Flash. But I haven’t found those to be that bad, myself.


Now have it generate the articles and comments, too…


The moat will be memory.

As a regular user, it becomes increasingly frustrating to have to remind each new chat “I’m working on this problem and here’s the relevant context”.

GenAI providers will solve this, and it will make the UX much, much smoother. Then they will make it very hard to export that memory/context.

If you’re using a free tier I assume you’re not using reasoning models extensively, so you wouldn’t necessarily see how big of a benefit this could be.


Python is a pretty bad language for tabular data analysis and plotting, which seems to be the actual topic of this post. R is certainly better; hell, Tableau, Matlab, JMP, Prism and even Excel are all better in many cases. Pandas + seaborn has done a lot, but seaborn still has frustrating limits. And pandas is essentially a separate programming language.

If your data is already in a table, and you’re using Python, you’re doing it because you want to learn Python for your next job. Not because it’s the best tool for your current job. The one thing Python has on all those other options is $$$. You will be far more employable than if you stick to R.

And the reason for that is because Python is one of the best languages for data and ML engineering, which is about 80% of what a data science job actually entails.
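To illustrate the "pandas is essentially a separate programming language" point with a toy example (made-up data, just to show the idiom): idiomatic pandas is declarative method chains and boolean masks, not ordinary Python loops, and you have to learn that dialect on top of the language itself.

```python
import pandas as pd

df = pd.DataFrame({
    "group": ["a", "a", "b", "b"],
    "value": [1, 2, 3, 4],
})

# vanilla-Python mindset: loop and accumulate
totals = {}
for g, v in zip(df["group"], df["value"]):
    totals[g] = totals.get(g, 0) + v

# pandas mindset: filter, group, aggregate in one chain
chained = df[df["value"] > 0].groupby("group")["value"].sum().to_dict()

assert totals == chained  # same answer, very different dialects
```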


> And pandas is essentially a separate programming language.

I'd say dplyr/tidyverse is a lot more a separate programming language to R than pandas is to Python.


> And pandas is essentially a separate programming language.

No it isn't.


...unless your data engineering job happens on a database, in which case R's dbplyr is far better than anything Python has to offer.

