Hacker News | Mortiffer's comments

The R community has been hard at work on small data. I still highly prefer working on in-memory data in R; dplyr and data.table are elegant and fast.

The CRAN packages are all high quality: if a maintainer stops responding to emails for two months, the package is automatically removed. Most packages come from university professors who have been doing this their whole careers.


A really big part of an in-memory, dataframe-centric workflow is how easy it is to do one step at a time and inspect the result.

With a database it is difficult to run a query, look at the result, and then run a query on the result. To me, that is what is missing in replacing pandas/dplyr/polars with DuckDB.


I'm not sure I really follow; you can create new tables for any step if you want to do it entirely within the DB, but you can also just run DuckDB against your dataframes in memory.


In R, data sources, intermediate results, and final results are all dataframes (slight simplification). With DuckDB, to have the same consistency you need every layer and step to be a database table, not a data frame, which is awkward for the standard R user and use case.


You can also use duckplyr as a drop-in replacement for dplyr. It automatically falls back to dplyr for unsupported behavior, and for most operations it is notably faster.
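A minimal sketch of what that drop-in usage looks like (assuming the duckplyr package is installed; the pipeline below uses only standard dplyr verbs, so treat it as illustrative rather than a tested snippet):

```r
library(duckplyr)  # masks the dplyr verbs with DuckDB-backed versions

# The usual dplyr pipeline, now executed by DuckDB where supported;
# unsupported operations fall back to plain dplyr.
mtcars |>
  filter(cyl == 6) |>
  summarise(mean_mpg = mean(mpg))
```

The point is that existing dplyr code keeps working unchanged; only the execution engine underneath changes.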

data.table is competitive with DuckDB in many cases, though as a DuckDB enthusiast I hate to admit this. :)


You can, but then every step starts with a drop table if exists; insert into …


Or you nest your queries:

    select second from (select 42 as first, (select 69) as second);
Intermediate steps won't be stored, but as long as queries don't take too long to execute it's a nice way to do step-wise extension of an analysis.

Edit: It's a rather neat and underestimated property of query results that you can query them in the next scope.
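The same pattern works in any SQL engine. A minimal sketch using Python's standard-library sqlite3 (DuckDB's Python API accepts the same query, but sqlite3 needs no extra install):

```python
import sqlite3

# An in-memory database; the outer query selects from the inner
# query's result, with nothing materialized between the two steps.
con = sqlite3.connect(":memory:")
row = con.execute(
    "SELECT second FROM (SELECT 42 AS first, (SELECT 69) AS second)"
).fetchone()
print(row)  # -> (69,)
```

Each refinement of the analysis just wraps the previous query in another `SELECT ... FROM (...)` layer.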


We all have different definitions of what is difficult. Maybe annoying or bothersome would have been better words, but the following beats nesting things:

    df |> select(..) |>
        filter(...) |>
        mutate(...) |>
        ...
And every time I've learned something about the intermediate result I can add another line, or save the result in a new variable and branch my exploration. And I can easily just highlight and run any number of steps from step 1 onwards.

Even old-school

    df2 <- df[...]
    df2 <- df2[...]
gives me the same benefit.


Yeah, sure, I do a lot of such things in RAM in Elixir, some Lisp, PHP or, if I must, Python.

But sometimes I just happen to have just imported a data set in a SQL client or I'm hooked into a remote database where I don't have anything but the SQL client. When developing an involved analysis query nesting also comes in handy sometimes, e.g. to mock away a part of the full query.



Absolutely, if the engine has them and they're not wonky somehow.


Thinking the same, also considering the amount of speed-up you get from Copilot.


You can't have a bank account without a government-approved money manager account, so you can't really self-host it.

Also, you can't off-ramp crypto without a counterparty willing to make the trade (liquidity).

Catchy title, but it was also vibe-coded, so I'd be extremely surprised if it works as advertised.


Power is back in some parts of Lisbon.


Why do they bring up vibe coding here? They are just a Firebase alternative, and Google has way superior AI code-gen tools.


Supabase is quite well integrated with vibe coding tools: Cursor, Replit, v0, etc. I agree that Firebase is a superior, well-integrated product, but IMO the vibes are on Supabase's side.


My guess is that they got that insane overvaluation because they sold themselves as an AI company


My humble guess is they sold themselves as the enabler for the AI vibe coding "revolution"


Sounds like sponsored content. In every other review I have read, people say they go back to the laptop because the text fidelity, eye strain, and keyboard-on-lap combination just makes it the better productivity setup.


I had trouble believing anything in the article, since every sentence or two has a link to "the best laptop" or "the best powerbank". It just seems like a hub for a bunch of links to sponsored content.


Don't forget the link to the massively updated Best Chair post, loaded with affiliate links :)


I'm over 50 and need reading glasses as well as distance glasses. I actually find working in the Quest 3 better than a laptop in many ways. The trade-off between (virtual) screen size and focusing distance seems easier to manage. With a laptop, the distance sweet spot for vision isn't always the same as the comfort sweet spot for posture. I could probably optimize my desk setup to improve this, but the point of a laptop is freedom from being chained to a desk.

If I could get a remote keyboard/trackpad with a better range then I wouldn't need a laptop at all but currently I also use a laptop and Chrome Remote Desktop when I need text entry or a regular mouse.


Yeah, it's interesting because the focal distance in the Quest is a fixed ~5 ft, regardless of the apparent depth of the thing you're looking at in VR. This is a pretty comfortable spot for old eyes, so it can make things a little crisper compared to real life, at shorter and further apparent distances.

Though it comes with its own set of discomfort issues: https://en.m.wikipedia.org/wiki/Vergence-accommodation_confl...


Do you wear both VR and regular glasses? I'm not sure the VR set can accommodate cylinder and astigmatism of my prescription lenses.


I thought the same. Notice he doesn’t say it’s better than a laptop, only better than he expected. Then he goes on to explain what he doesn’t like about laptops generally, without explaining what he doesn’t like about this set up.


I really, really wanted the SimulaVR headset to work out because of the attention they were paying to text rendering. The hardware feels dead, but the virtual desktop project might still have legs: https://github.com/SimulaVR/Simula

As far as eye strain goes, I think there's room for argument: having virtual screens cinema-screen-distance away from you is less straining than something under a meter away, but only if the text rendering is up to the job.


I don't know that the hardware is dead yet. They got a cash infusion last year and there are occasional hardware updates in their Discord. It's just a slow process with 1-2 engineers total working on the many different hardware and software and firmware elements of the overall product.


Yeah - last update on the web page was, what, December? I think they're going to get outrun by the rest of the market. "Walking dead", possibly. If I can get NUC+XReals+some sort of integrated desktop then they'd need something really compelling to make their headset worthwhile at the price they're aiming at.


I use a pair of Air Ones with prescription lens inserts and a DIY nose pad for comfort. I can't beat my desktop monitors for clarity, but it is fantastic if you have to read a lot of documentation and like distraction-free environments. My job lets me hook up my Samsung phone for basic access to documents, and I enjoy reading up on things as I get away from my desk for a change of pace. To say nothing of flying coach with my Steam Deck on a massive screen.


Laptops are pretty bad ergonomically, compared to a proper desktop setup. It’s true that current AR tech is even worse for most.


I came to the same conclusion and moved on. We had some C# applications reading some Python.


I have not found a single prof who thinks the trend of having a higher and higher percentage of university staff be administrators is a good thing.

I wonder if it is possible to tie funding to a maximum allowed ratio of administrators to professors/lecturers.


Love how a competitor to git was published on GitHub


How can we get rid of vendor lock-in and have fair market competition drive cloud prices down?

It must be possible to make cloud more cost effective via specialization versus every company building the same infrastructure again and again.


Proposed solution: a set of neutral validators that define standard interfaces and then test any cloud wanting to get listed for compatibility and minimum included performance (also egress).

If all this data is open we should get competition back and fix cloud.

Disclaimer: I am working on such a system. Enterprises love the idea and it does well at hackathons, but the validation standard is not production-ready yet. Would be happy to get critical HN feedback.

