From my perspective, SVN merges were an absolute ape show even with a team as small as 5 people.
Internet down? You cannot do anything, because you cannot even commit with Subversion - let alone with the totally obscure version control software I was also working with.
Yes, it was still better than keeping copies of the source code named with a date/time.
My killer feature in Git: I can work in my local repo and do whatever I want there. Only when I have to share the code do I have to clean up commits/code.
In principle I probably could have kept a local repo with SVN too, but moving changes between repos would likely have been more hassle than it was worth. Also, as a young dev I never thought about that until I saw Git workflows, and it blew my mind.
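To make that concrete, a minimal sketch of the workflow (branch names and commit messages are just placeholders):

    # hack freely on a private local branch; nobody else ever sees these commits
    git checkout -b my-feature
    git commit -am "wip"
    git commit -am "wip 2, probably broken"
    git commit -am "actually works now"

    # only when it's time to share: squash the mess into tidy commits, then publish
    git rebase -i main
    git push -u origin my-feature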
> Internet down? You cannot do anything, because you cannot even commit with Subversion.
That, in itself, I consider a poor argument. The obvious solution would be to ensure the Internet connection does not go down.
However, as an important design precondition, it forces the builder of such a system to embrace asynchrony, eventual consistency, and so on, which leads to a much better design even if your internet is 99.9999% available.
I'm convinced the "offline" requirements are the reason merging, rebasing, etc. are so well done in Git.
No, my argument is that "ensuring connectivity" is a much simpler, cheaper, and easier solution than "build all tools so they can handle offline".
Or, fine: Git can survive the network being down. Now, how do you read the framework/API/lib documentation? Need to ask a colleague where that key for the CI was again? "Build offline-first messaging." Need that backtrace from the last CI run? "Build some auto-asset-syncing from the CI to local." And so on.
If you think that is a valid way of solving things, then something is wrong with how you approach problems.
> No, my argument is that "ensuring connectivity" is a much simpler, cheaper, and easier solution than "build all tools so they can handle offline".
The only problem with this is that ensuring connectivity is impossible. You can have multiple redundant backups using different technologies, and there is always a non-zero probability that all of them fail at once. This is compounded by factors like connectivity on the other end: how can you ensure connectivity when someone is working from a hotel at a conference, or from home, or from their yacht? A centralized repo also means people have to work from the office, unless you're going to control their connectivity too.
When it comes to source control, something that lets developers carry on working when they don't have access to a central repo is massively better than everything else.
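Concretely: nearly every day-to-day Git command needs nothing but the local .git directory, while SVN needs the server for commit, log, and update. A sketch, assuming a main branch:

    # all of this works with zero network access
    git commit -am "fix parser edge case"    # record work locally
    git checkout -b experiment               # branch locally
    git commit -am "try a risky refactor"
    git checkout main
    git merge experiment                     # merge locally
    git log -p                               # full history is on disk too

    # one command to sync once you're back online
    git push origin main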
> Now, how do you read the framework/API/lib documentation?
It's in the repo, so ... just read it like normal because you have a local copy?
"Well done" might be overstating things... they're better than other tools alright, but if I only had a dollar for every single time I saw a poor git diff. Forget language-aware diffing; the line-level diffing doesn't even seem to handle even the dumbest cases of indentation and brace-matching with any intelligence when merging. They have a lot of low-hanging fruit for improvement.