Should be doable on consumer hardware nowadays, if you cheat by using a file system that either supports sparse files (https://en.wikipedia.org/wiki/Sparse_file) or block-level deduplication (https://en.wikipedia.org/wiki/Data_deduplication). You may need to use raw block I/O to create such a file, and there will be lots of duplicated content in the archive.
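A minimal sketch of the sparse-file trick, assuming a file system with sparse-file support (ext4, XFS, NTFS, APFS, etc.): seeking past the end of a file and writing a single byte leaves a "hole" that reports a huge logical size while consuming almost no disk. The file name and the 1 GiB size here are arbitrary examples; the same trick scales to terabytes.

```python
import os

path = "huge_sparse.bin"        # hypothetical example file
size = 1024**3                  # 1 GiB logical size

with open(path, "wb") as f:
    f.seek(size - 1)            # jump past the hole; nothing is written here
    f.write(b"\0")              # one real byte at the very end

st = os.stat(path)
print(st.st_size)               # logical size: 1 GiB
print(st.st_blocks * 512)       # actual disk usage: a few KiB on a sparse-capable FS
```

No raw block I/O is needed for the hole itself; `seek` past EOF is enough on most file systems, though deduplicated or compressed storage may need more specialized tooling.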
Also: how hard is that limit? ZIP archives have their TOC at the end of the file and allow for inserting ‘junk’ that is never referenced in the ZIP’s table of contents. Isn’t it possible to add such junk to make an archive go over that limit (assuming that your file system allows files larger than 2⁶⁴ bytes)?
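The "unreferenced junk" trick can be sketched in a few lines: because ZIP readers locate the table of contents by scanning backwards from the end of the file, bytes prepended to the archive are never referenced and simply ignored (this is the same mechanism self-extracting ZIPs rely on). A toy demonstration with Python's `zipfile`, using an in-memory archive and arbitrary padding bytes:

```python
import io
import zipfile

# Build a tiny ZIP archive in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("hello.txt", "hello")

# Prepend 4 KiB of junk that no ZIP record references.
padded = io.BytesIO(b"\xde\xad\xbe\xef" * 1024 + buf.getvalue())

# The archive still opens and reads fine; readers find the
# end-of-central-directory record from the end of the file.
with zipfile.ZipFile(padded) as zf:
    print(zf.read("hello.txt"))  # b'hello'
```

Scaled up, the same idea inflates an archive's on-disk size arbitrarily (file-system limits permitting) without changing what it extracts to.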
The problem is once you zip them to full compression, you really can't use them ever again. That is unless you get the good ones that let you technically unzip without requiring destruction.
There is a whole lot of brochure-type web work that has disappeared, either to these site builders or Facebook. I don't know what happened to the people doing that sort of work, but I would assume most weren't ready to write large React apps.
Why are you assuming that? How do you think all the new React jobs were filled? React developers don’t magically spring into existence with a full understanding of React out of nowhere, they grow into the job.
Sure it does. It’s not a guarantee, but presuming that a pattern is likely to continue is not nothing. When a pattern is observed, the onus is on the “This time is different!” side to make their case.
If you don’t accept that an observed trend says anything about the future, you shouldn’t make unsupported assertions in the opposite direction. They say less.
AI aiming to automate everything is something new. That's the point. There was no AI in the past similar to what is slowly unfolding now. Not even close. If you disagree with the word "anything" I used, then yes, I understand I shouldn't have used this word.
Speed has been the consistent thing I've noticed with Gemini too, even going back to the earlier days when Gemini was a bit of a laughing stock. Gemini is fast.
I don't know exactly the speed/quality tradeoff but I'll tell you this: Google may be erring too much on the speed side. It's fast but junk. I suspect a lot of people try it then bounce off back to Midjourney, like I did.
"Now that you've been promoted, you don't build CRUD tools anymore. Those are below your level. Instead, you build AI agents that build the CRUD tools."
Or should this be more specific: you shouldn't shorten URLs using the provider's domain name, but bring your own domain. Then if the provider goes away, you can migrate the links.
That's why I set up my own thing.
I don't care about analytics at all, so I just wrote a simple build system to generate some very basic HTML redirects.
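The kind of build step described could look something like this: a sketch (not the commenter's actual system) that turns an assumed mapping of short slugs to destination URLs into static meta-refresh redirect pages, one per slug.

```python
import os
from html import escape

# Assumed example mapping; in practice this might come from a config file.
REDIRECTS = {
    "blog": "https://example.com/blog/",
    "talk": "https://example.com/talks/2024/",
}

TEMPLATE = """<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="refresh" content="0; url={url}">
<link rel="canonical" href="{url}">
</head>
<body><a href="{url}">Moved here</a></body>
</html>
"""

for slug, url in REDIRECTS.items():
    # Emit out/<slug>/index.html so any static host serves /<slug> as a redirect.
    os.makedirs(os.path.join("out", slug), exist_ok=True)
    with open(os.path.join("out", slug, "index.html"), "w") as f:
        f.write(TEMPLATE.format(url=escape(url, quote=True)))
```

Because the output is plain static files, the whole thing can be hosted anywhere, and the redirect targets can be updated just by re-running the build.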
As a hypertext purist, I used to think this. But working for a large organization, shortlinks can be an invaluable way to actually maintain the integrity of links that have been deployed to unrevisable media (emails, print, pdfs, etc).
If a resource has been relocated off of a host/URL, often part of that situation is that we don't have immediate access to implement a redirect from that host to the resource's new location.
Now I see a shortlink manager as a centralized redirect manager, which is so much more rational and stable than creating a tangle of redirect config across dozens of hosts or hundreds of content applications.
The caveat is that you shouldn't use a 3rd-party domain or service; you should definitely at least use your own domain. You also don't need to make the links unreadable hashes; they can actually be more human-friendly.