Hacker News: rockinghigh's comments

It's a lot simpler. These models are not optimized for ambiguous riddles.

There's nothing ambiguous about this question[1][2]. The tool simply gives different responses at random.

And why should a "superintelligent" tool need to be optimized for riddles to begin with? Do humans need to be trained on specific riddles to answer them correctly?

[1]: https://news.ycombinator.com/item?id=47054076

[2]: https://news.ycombinator.com/item?id=47037125


How is this riddle relevant to a coding model?

It's not a coding model. Go to https://chat.z.ai/ and you'll see it is presented as a generalist.

They do. Pretty much all agentic models call linting, compiling and testing tools as part of their flow.

It's called problem decomposition and agentic coding systems do some of this by themselves now: generate a plan, break the tasks into subgoals, implement first subgoal, test if it works, continue.
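The loop described above can be sketched in a few lines. This is a hypothetical outline, not any real agent framework's API; `generate_plan`, `implement`, and `run_tests` stand in for calls to a model and a test runner:

```python
# Hypothetical sketch of the plan / subgoal / test loop described above.
# generate_plan, implement, and run_tests are caller-supplied stand-ins
# for model calls and a test harness, not a real library API.

def agentic_loop(task, generate_plan, implement, run_tests, max_retries=3):
    plan = generate_plan(task)          # break the task into subgoals
    for subgoal in plan:
        for _attempt in range(max_retries):
            patch = implement(subgoal)  # model writes code for this subgoal
            ok, feedback = run_tests(patch)
            if ok:
                break                   # subgoal done, continue with the next
            # feed test output back into the next attempt
            subgoal = f"{subgoal}\nTest feedback: {feedback}"
        else:
            raise RuntimeError(f"could not complete subgoal: {subgoal}")
    return "all subgoals completed"
```

The key design point is the inner retry loop: test failures are appended to the subgoal prompt, which is roughly how current agentic coding systems close the feedback cycle.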


That's nice if it works, but why not look at the plan yourself before you let the AI have its go at it? Especially for more complex work where fiddly details can be highly relevant. AI is no good at dealing with fiddly.


That's what you can do. Tell the AI to make a plan in an MD file, review and edit it, and then tell another AI to execute the plan. If the plan is too long, split it into steps.


This has been a well-integrated feature in Cursor for six months.

As a rule of thumb, almost every solution you come up with after thirty seconds of thought for an online discussion has already been considered by people doing the same thing for a living.


That's exactly what Claude does. It makes a comprehensive plan broken into phases.


There’s nothing stopping you from reviewing the plan or even changing it yourself. In the setup I use the plan is just a markdown file that’s broken apart and used as the prompt.


A language model, in computer science, is a model that predicts the probability of a sentence, or of a word given the preceding words. This definition predates LLMs.
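That classical definition can be shown in a few lines: a toy bigram model that estimates conditional word probabilities by counting. The corpus and helper names here are made up for illustration; an LLM estimates the same conditional probabilities with a neural network instead of counts.

```python
from collections import Counter

# Toy bigram language model: P(word | previous word) from raw counts.
corpus = "the cat sat on the mat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def prob(word, prev):
    """Maximum-likelihood estimate of P(word | prev)."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

def sentence_prob(sentence):
    """Probability of a sentence as the product of bigram probabilities."""
    words = sentence.split()
    p = 1.0
    for prev, word in zip(words, words[1:]):
        p *= prob(word, prev)
    return p
```

For the toy corpus above, `prob("cat", "the")` is 2/3, since "the" is followed by "cat" twice out of three occurrences.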


A 'language model' only has meaning insofar as it tells you this thing 'predicts natural language sequences'. It does not tell you how these sequences are being predicted or anything about what's going on inside, so all the extra meaning OP is trying to attach by calling them Language Models is, well... misplaced. That's the point I was trying to make.


How do you join two datasets using R-trees? In a business setting, having a static and constant projection is critical. As long as you agree on a zoom level, joining two datasets with S2 or H3 is really easy.
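The "agree on a zoom level, then join" idea can be sketched with a toy grid. This uses a square grid as a stand-in for S2/H3 cells (real H3 cells are hexagons and S2 cells are quads); the names and resolution scheme are made up, but the join mechanics are the point: the cell id is a static, data-independent key, so two datasets can be hash-joined without walking a tree.

```python
from collections import defaultdict

def cell_id(lat, lng, zoom):
    """Snap a point to a square grid cell at the given resolution.
    A stand-in for an S2/H3 cell id: static and data-independent."""
    size = 1.0 / (2 ** zoom)
    return (int(lat // size), int(lng // size))

def spatial_join(points_a, points_b, zoom):
    """Pair up (lat, lng, payload) records that fall in the same cell."""
    index = defaultdict(list)
    for rec in points_b:
        index[cell_id(rec[0], rec[1], zoom)].append(rec)
    return [(a, b) for a in points_a
            for b in index[cell_id(a[0], a[1], zoom)]]
```

Because both sides compute the same key independently, this parallelizes trivially across nodes: shuffle by cell id, join locally.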


Spatial indices simply partition your data in N-dimensional space the same way a binary tree does it in 1-dimensional space.

The whole advantage over a static partition is that it will allow you to properly deal with data that is irregularly distributed.

Those data structures can definitely be merged if that's what you're asking.


This data is indeed not irregularly distributed; in fact, the fun thing about geospatial data is that you always know its maximum extent.

About your binary tree comment: yes, this is absolutely valid, but consider then that binary trees are also a bad fit for distributed computing, where data is often partitioned at the top level (making it no longer a binary tree but a set of binary trees) and cross-node joins are expensive.


I wouldn't say R-trees solve the problem better. Joining multiple spatial datasets indexed with R-trees is more complex, as the nodes are dynamic and data-dependent. Neighborhood search is also more complicated because parent nodes overlap.


It's a well-researched area. My understanding is that for most use cases, and data like this, R-trees outperform, as bounding-box comparisons are fast to run and the bounding boxes tend to be well organised to chunk data efficiently. H3 is a coarser grid, and you may find lots of your points are clustered in a few cells, so you end up doing more expensive detailed intersection calculations. Of course it all depends a little on your data, your use case, and to some extent the parameters chosen for the spatial index. But I think it's safe to say, based on industry experience, that R-trees do a very good job 99.9% of the time.
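The cheap primitive behind that speed is worth making concrete: a bounding-box overlap test is just four comparisons, so an R-tree can prune most of the search space before doing any exact geometry work. A minimal sketch (boxes as `(min_x, min_y, max_x, max_y)` tuples, a hypothetical convention, not PostGIS's internals):

```python
def bboxes_intersect(a, b):
    """Axis-aligned bounding-box overlap: four comparisons, no geometry.
    Boxes are (min_x, min_y, max_x, max_y); touching edges count as overlap."""
    return (a[0] <= b[2] and b[0] <= a[2] and
            a[1] <= b[3] and b[1] <= a[3])

def candidates(query_box, boxes):
    """Filter step: keep only boxes whose bbox overlaps the query.
    The expensive exact-intersection test then runs on this short list."""
    return [b for b in boxes if bboxes_intersect(query_box, b)]
```

This is the classic filter-then-refine pattern: the index answers "which boxes might intersect" cheaply, and precise intersection runs only on the survivors.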

You can of course also use H3 in PostGIS directly, as well as R-trees. It helps significantly for heatmap creation and sometimes for neighbourhood searches.


That's not true when tiling the Earth, though. You need 12 pentagons to close the shape at every zoom level; you can't tile the Earth with just hexagons. That's also why footballs stitch together pentagons and hexagons.
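The "exactly 12" follows from Euler's polyhedron formula V − E + F = 2: with p pentagons and h hexagons and three faces meeting at each vertex, V = (5p + 6h)/3 and E = (5p + 6h)/2, so V − E + F = p/6, forcing p = 12 regardless of h. A quick numeric check using the football (truncated icosahedron: 12 pentagons, 20 hexagons):

```python
# Euler's formula check for the football: 12 pentagons + 20 hexagons,
# three faces meeting at every vertex.
pentagons, hexagons = 12, 20
F = pentagons + hexagons                 # faces: 32
E = (5 * pentagons + 6 * hexagons) // 2  # each edge shared by 2 faces: 90
V = (5 * pentagons + 6 * hexagons) // 3  # 3 faces meet at each vertex: 60
assert V - E + F == 2                    # Euler's formula holds
```

Since the h term cancels, the same 12 pentagons are needed at every H3 resolution, exactly as the comment says.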


Instagram uses it as their main backend. They have hundreds of millions of daily users. Some of the critical backend services are in C++.


Depends on your definition of "use." They use an internal async fork, and don't use the ORM: https://djangochat.com/episodes/django-instagram-carl-meyer


Tariffs were a huge point of debate in his first administration. The government had to pay farmers roughly $28 billion to offset the impact of the tariffs.

> China implemented retaliatory tariffs equivalent to the $34 billion tariff imposed on it by the U.S. In July 2018, the Trump administration announced it would use a Great Depression-era program, the Commodity Credit Corporation (CCC), to pay farmers up to $12 billion, increasing the transfers to farmers to $28 billion in May 2019. The USDA estimated that aid payments constituted more than one-third of total farm income in 2019 and 2020.

