It isn't THAT bad. Investors definitely care more about the business model and growth potential. But you will certainly run into those who ask "How are you integrating AI into your platform?" without any real idea of what that means.
"Normalization was built for a world with very different assumptions. In the data centers of the 1980s, storage was at a premium and compute was relatively cheap."
But forget to normalise and you will be paying five figures a month for your AWS RDS instance.
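To make that concrete, here is a minimal sketch of the trade-off (illustrative schema with made-up table names, sqlite3 standing in for RDS): the denormalised table repeats every customer detail on every order row, and that duplication is exactly the storage you end up paying for.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    -- Denormalised: customer details duplicated on every order row.
    CREATE TABLE orders_flat (
        order_id       INTEGER PRIMARY KEY,
        customer_name  TEXT,
        customer_email TEXT,
        shipping_addr  TEXT,
        total_cents    INTEGER
    );

    -- Normalised: customer details stored once, referenced by id.
    CREATE TABLE customers (
        customer_id   INTEGER PRIMARY KEY,
        name          TEXT,
        email         TEXT,
        shipping_addr TEXT
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        total_cents INTEGER
    );
    """)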
"Storage is cheap as can be, while compute is at a premium."
This person fundamentally does not understand databases. Compute has almost nothing to do with the data layer - or at least, if your DB is maxing out on CPU, something is wrong, like a missing index. And for storage, it's not like you are just keeping old movies on an old hard disk - you are actively accessing that data.
It would be more correct to say: disk storage is cheap, but SDRAM cache is 1000x more expensive.
The main issue with databases is I/O: the more data you have to read, process, and keep in cache, the slower your database becomes. Relational or non-relational, they all follow the same rules of physics.
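To make the missing-index point concrete, a quick sketch using Python's built-in sqlite3 (toy table and data of my own choosing, not any particular production setup): the same query goes from a full scan of every row to an index lookup that touches only the matching ones.

    import random
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
    con.executemany(
        "INSERT INTO events (user_id, payload) VALUES (?, ?)",
        [(random.randint(1, 10_000), "x" * 100) for _ in range(100_000)],
    )

    query = "SELECT count(*) FROM events WHERE user_id = ?"

    # Without an index: the plan is a full SCAN of the table (reads every row).
    print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

    con.execute("CREATE INDEX idx_events_user ON events(user_id)")

    # With the index: a SEARCH using idx_events_user (touches only matching rows).
    print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

Same data, same query; the only thing that changes is how much the engine has to read.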
> This person fundamentally does not understand databases.
Oh boy I do love hackernews :).
It sounds like you’ve spent a lot of your career in a SQL world. Have you worked a lot with DDB/MongoDB/Cassandra? If not, give it a whirl with more than a toy application and share your thoughts. Already done that? Try the brand new “constructive criticism” framework.
Instead of “this person fundamentally does not understand databases” based on 13 words in a 1200+ word article, consider: “I disagree with this statement and here’s why”.
You get all of the karma with none of the ad hominem! Win win!
Location: UK, Horsham
Remote: Yes
Willing to relocate: No
Technologies: Data Engineering, Data Architecture, Data Performance Engineering, Java, Python, SQL.
Résumé/CV: https://www.linkedin.com/in/jonathanlevin/
Email: mail@jonathanlevin.co.uk
This is obvious to me. Since Hadoop came out, (a lot of) people have been giving up on even forming algorithms and just dumping data into machine learning and hoping for the best. I recall someone high up at Google complaining about it.
We need to get back to forming algorithms as well as concepts and first principles. We cannot and should not expect ML to brute-force its way to the patterns while we just sit back and relax.
Here is another prediction for you: we will not solve ray-tracing in games and movie CGI with more hardware. We will need some algorithm that gets us 80-90% of the way there in a smart way.
This was my first thought. Well, to be more complete - smart algorithms beat dumb algorithms even if the dumb algorithms use hardware acceleration (unless the problem is trivial anyway). Smart algorithms plus hardware acceleration beat smart algorithms on general-purpose hardware. Smart algorithms are just better.
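A toy illustration of the point (pure Python, nothing to do with ray tracing specifically, and the sizes are arbitrary): a binary search over a sorted list beats a linear scan so thoroughly that even granting the scan an imaginary 100x hardware speedup doesn't close the gap.

    import bisect
    import random
    import timeit

    data = sorted(random.sample(range(1_000_000), 200_000))
    targets = random.sample(data, 200)

    def dumb_lookup(xs, t):
        # "Dumb" algorithm: O(n) linear scan.
        for x in xs:
            if x == t:
                return True
        return False

    def smart_lookup(xs, t):
        # "Smart" algorithm: O(log n) binary search on the sorted list.
        i = bisect.bisect_left(xs, t)
        return i < len(xs) and xs[i] == t

    t_dumb = timeit.timeit(lambda: [dumb_lookup(data, t) for t in targets], number=1)
    t_smart = timeit.timeit(lambda: [smart_lookup(data, t) for t in targets], number=1)

    print(f"linear scan:   {t_dumb:.3f}s ({t_dumb / 100:.3f}s with a hypothetical 100x speedup)")
    print(f"binary search: {t_smart:.5f}s")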