BASIC 1.77 ARMv8 AArch32 opcodes
Pages: 1 2
Steve Fryatt (216) 2105 posts |
Maybe. Git does seem to be a lot better at handling parallel development, which is fairly common in many Open Source projects, than systems like CVS and Subversion. Certainly I’ve found, in my recent forays into it, that it’s able to make sense of the kind of “fixing bugs in a stable release whilst doing major work on a couple of branches” that would have Subversion choking very quickly. The NetSurf devs cited similar reasons when they migrated from Subversion several years ago.

The problem, as Rick says, is that it’s fiendishly complex and hence difficult to master. That XKCD was my memory of working on NetSurf post Git, which was one reason why I began to focus my attention back towards my own projects. But, as I might have said earlier on the thread, armed with Tortoise Git on Windows, I’m starting to warm to the things that it seems to be able to do… even if I’ve not quite worked out how it does them, or what might cause it to stop working.

And there’s no denying ROOL’s other point: Git is currently where it’s at WRT version control. Many (most?) popular communal online repositories are Git-based now, and even Subversion is becoming rarer. People do use Git, and will be familiar with it. |
Steffen Huber (91) 1953 posts |
When we migrated from CVS to Git at work, we introduced the role of a “Git Hero” for every team and two “Git Super Heroes” for the whole development department (and I was neither). The problem with Git is mostly its complete user-unfriendliness coupled with not-very-good graphical UIs (although they have got a lot better over the last few years!), so getting into trouble is very easy once you leave the happy path.

Despite its hideous command line interface, which makes standard UNIX stuff look very user-friendly, Git is now ubiquitous, and adopted everywhere. Deal with it.

Both CVS and SVN have their own share of problems. Just try to rename a file in CVS… or do a lot of branching in SVN… |
Steve Pampling (1551) 8170 posts |
Hmmm, there’s a common comment about 10 million flies that’s going round in my mind. Can we do an easier sales job and promote vi as a beautiful, flexible and friendly text editor? |
Steve Fryatt (216) 2105 posts |
You seem to have selectively ignored Steffen’s and my comments about the benefits. Git may be complex, but it is a lot more capable in some key areas than Svn or CVS. It’s almost as if it was designed to avoid the known pitfalls of existing VCSs — and doing version control right is far from a trivial thing. Also, I don’t think a project migrating from CVS is in any position to talk about user familiarity or ease of use. :-) |
Stuart Painting (5389) 714 posts |
Thanks for that. I think I was thrown by an earlier comment about an event being “orphaned” when merging a side branch into the master branch, which led me to suppose that the hashes referred to individual code changes in isolation, rather than being a pointer to the entire state of the branch immediately after the change was applied.
This strikes me as the better option. The information needs to be stored on the server, rather than appearing as something of an afterthought in the zip file. There’s a reasonable chance that the user would be able to determine the build date of a component, but wanting the user to supply the corresponding list of hashes (or expecting the developers to hold local copies of every build) is asking for trouble. |
Rick Murray (539) 13840 posts |
The ExplainXKCD article describes git as… “Git is a version control system, used to manage the code in many millions of software projects. It is very powerful, and was amongst the first widely adopted tools to use a distributed version control model (the “beautiful graph theory tree model”), meaning that there is no single central repository of code. Instead, users share code back and forth to synchronise their repositories, and it is up to each project to define processes and procedures for managing the flow of changes into a stable software product.” (my emphasis) Uh… ? |
Jeffrey Lee (213) 6048 posts |
(my emphasis) The common workflow is to have a remote server (github, gitlab, etc.) which acts as the central repository of code. But the underlying principle of “there is no single central repository” holds true.

In CVS, “cvs checkout” will grab a snapshot of the source code at a specific point in time. That’s all you get – one version of the source, with the rest still hidden away on the central server. With git, “git clone” grabs a complete copy of the repository – the current state, and its full history.

This means that if you’re on a big project which is being worked on by thousands of people (like the Linux kernel), your central server isn’t constantly overworked trying to keep up with all of the changes people are making. Teams can fork the repository to their own server and use that for day-to-day development, only syncing with the central server every few days or weeks. And individuals can spend weeks or months working on changes locally, making (and occasionally throwing away) hundreds of commits without pushing a thing to the team server or the central server.

I think it’s also possible for individuals to share changes in a peer-to-peer manner, without requiring them to set up a git server (although I’m not sure if there are many user-friendly tools for this). |
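[The clone-versus-checkout distinction above can be sketched with a purely local experiment; no server is involved, and all repository paths and names below are made up for illustration:]

```shell
# A purely local sketch: a clone carries the full history, and commits
# need no server at all.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Build a small "upstream" repository with two commits.
git init -q upstream
git -C upstream config user.email you@example.com
git -C upstream config user.name "You"
echo one > upstream/file.txt
git -C upstream add file.txt
git -C upstream commit -qm "first"
echo two > upstream/file.txt
git -C upstream commit -qam "second"

# Unlike "cvs checkout", a clone brings the entire history with it.
git clone -q upstream clone
git -C clone log --oneline        # both commits are already local

# New work is committed locally; nothing reaches "upstream" until a push.
echo three > clone/file.txt
git -C clone commit -qam "third, local only"
```

[After this runs, the clone holds three commits while the upstream still has two, which is exactly the “hundreds of commits without pushing a thing” situation described above, in miniature.]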
Rick Murray (539) 13840 posts |
Better for parallel development, CVS sucks at renaming, and “everybody uses it” (but so few people know how it actually works that those few who do are designated the appellation “hero”!).

The model is overly complicated. I’ve read several overviews and, sorry, can’t get my head around it. You need to know a few things, namely project → release → component → file → version. These files were these versions on these dates. Isn’t anything more than that just adding complication?

The command line is insane. I suppose it says a lot that repo corruption is just “something one has to deal with at some stage”. Everybody screws up (we’re human, it happens) so it’d be nice if the source versioning system didn’t add to the misery… Which reminds me, it seems so many blogs talk about ways to fix git because the documentation is written for people who already know how to use it… And of course with a whole pile of extra terminology like reflog. Is that ref-log or re-flog? ;-)

Now jumping back a paragraph – one of the fundamental problems I have with git is how easily it seems to mess itself up. The golden promise of a source version control system is that code, when pushed, is safe. It’s a versioning system; it tracks modifications from update to update. That’s its raison d’être. Even if you go mad and mess up a bunch of files, it’s possible to pull copies from before the error or to revert. Git… doesn’t appear to be safe if you do something daft before issuing a rebase (look, more terminology) command.

That’s not to say that the idea is bad, it’s the implementation. And that everybody uses it means nothing. It seems like everybody is on Facebook these days. Need I say more? :-) |
Steve Pampling (1551) 8170 posts |
I was going to say that, having re-read Steffen’s comments, I can find very little that counts as positive, but Rick commented and I think Rick’s summary covers it, if you note that the “everyone uses it” I covered with the 10 million flies quip1 – something that Rick goes on to cover with a Facebook example.

1 The thing is that at work I have a steady stream of third-party providers saying they are allowed to do (some sloppy/dodgy practice) at xyz. Just because others do something does not of itself provide evidence that it is a good idea to do it yourself. |
Steve Pampling (1551) 8170 posts |
Prime example of something I don’t do along with, I think, most1 of the techies in our department thus proving that the “everybody” used in the example is clearly just a limited subset of the real everybody. 1 It has to be “most” since I haven’t outright asked all of them but those who have stated they don’t are definitely a majority approaching a totality. |
Rick Murray (539) 13840 posts |
I think you guys are the limited subset. I set up an account ages ago as many people at work just expected me to be there. Not logged in in ages, never saw the point beyond sharing photos of kids doing dumb things and the adults doing dumber things. It’s something I tried and thought “meh”. Like Second Life. Tried that, oh, once. Shrugged. I’m just not a particularly sociable person. God1 help us all. 1 Alternative mythological entities are available. |
Rick Murray (539) 13840 posts |
Just out of interest – what happens when you have a huge project (like the sort of stuff you get on Linux) but everybody running their own local incarnations – and multiple people modify the same file? Surely the moment one is synced with the central server, all the others are automatically out of date? Also, how is this that different to CVS? One can download the various sources, use a local copy, work with that, then sync up with the master server…?
Is there no way to retrieve all? |
Jeffrey Lee (213) 6048 posts |
Git doesn’t mess itself up. The problem is that it gets messed up because people don’t know how to use it (thanks to poor UI, documentation, etc.) – or don’t know how to properly fix mistakes once they’ve made them. Git may be difficult to use, but I’d argue that it gives people more opportunities to correct their mistakes than a traditional centralised version control system does. If you mess up a merge in git, you can delete it from your local repository and try again. Good luck doing that with CVS, SVN, Perforce, etc.
Yes. When you pull changes from a remote server, they’ll get merged with any changes you’ve got in your local repository. So if two people have modified the same code then there’ll probably be a merge conflict which will need resolving. It’s no different to any other VCS in that regard (although newer VCS’s generally have cleverer merge engines than older ones).
With CVS you have no local history. For a version control system, that’s pretty bad. If you’re working on a feature for days or weeks, and you discover that you’ve broken something, you’ve got no local history to fall back on to find when/where you broke it (or what the good version of the code was like). If you get latest and merge some remote changes into the code you’ve been working on locally, and you mess up that merge, you’re fucked. OK, CVS does make backups of files before merging, but if you’re dealing with big or complex merges, you’ll more often than not need to look at a few different versions of the code in different places before you’re able to work out the correct way of proceeding. With git you can easily look at the history of both your changes and the remote changes to work out the correct course of action (or to recover relevant bits of code you’ve broken).
Exactly. And you said git has a bad CLI? ;-) |
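[The “delete the merge and try again” escape hatch described above can be sketched locally; every repository, branch, and file name below is invented for illustration, and `git merge --abort` is the built-in way to abandon a half-done merge:]

```shell
# A local sketch of throwing away a botched merge and starting over.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email you@example.com
git config user.name "You"

echo base > f.txt
git add f.txt
git commit -qm "base"
main=$(git symbolic-ref --short HEAD)   # "master" or "main", depending on git version

# Two branches edit the same line, guaranteeing a conflict on merge.
git checkout -qb feature
echo feature > f.txt
git commit -qam "feature change"
git checkout -q "$main"
echo mainline > f.txt
git commit -qam "mainline change"

git merge feature || true     # stops with a conflict in f.txt
git merge --abort             # abandon the half-done merge entirely
git status --short            # clean working tree; nothing was lost
```

[Because the merge only ever existed in the local repository, aborting it restores the pre-merge state exactly; the feature branch and its history are untouched and the merge can be retried later.]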
Steve Pampling (1551) 8170 posts |
I was thinking of testing a first life, but apparently as a support techie I’m not supposed to have one. ;-)
Possibly Instagram. I’ve no knowledge of its content but it’s on the block list at work. |
Steve Fryatt (216) 2105 posts |
Yes, exactly like any other VCS. Please don’t tell me that you’ve never started to make changes to a collaborative project, and then realised that you haven’t done an update first? In Svn, incidentally, that can be incredibly painful to sort out. Been there and done that…
Git works completely offline, with all of the history locally on your own machine. You can make multiple local commits, without bothering the main project. You can even create your own branches to work on things without pushing them back to the server, so long as you merge them into a branch which is on the server at some stage if you want others to see them.

In case it’s not obvious, I’m in the slightly odd position of still disliking Git, whilst at the same time being able to see many reasons why it would be better for me to use it for my RISC OS development than to stick with Svn. I suspect that it could be a matter of time, especially if I become more familiar with Git: I’m increasingly doing things with it and thinking “I wish Svn did that”. |
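[The offline workflow described above (private local branches, merged back only when the work is ready) might look something like this; the repository, branch, and file names are made up for illustration:]

```shell
# A local sketch of working on a private branch entirely offline.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email you@example.com
git config user.name "You"

echo v1 > app.txt
git add app.txt
git commit -qm "initial"
main=$(git symbolic-ref --short HEAD)

# The server never needs to see this branch or its commits.
git checkout -qb experiment
echo v2 > app.txt
git commit -qam "try something"
echo v3 > app.txt
git commit -qam "refine it"

# Fold the finished work back into the shared branch, then tidy up.
git checkout -q "$main"
git merge -q --no-edit experiment
git branch -qd experiment
```

[Only after the merge would a `git push` of the shared branch make any of this visible to anyone else; until then the experiment exists nowhere but the local machine.]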
Chris Mahoney (1684) 2165 posts |
I don’t have an account, but I get the odd link to it. On a desktop you can see comments on each photo, but on a phone you now need an account before they’ll show up (this is a recent change). And they’ve forced two-factor authentication so you can’t even use Bugmenot! |
Steffen Huber (91) 1953 posts |
I have been using Git for many years now, and have never had a corrupted repo. However, in CVS times, there was once a major incident of server corruption, in addition to the usual “oh, I accidentally moved the branch label” fun and stuff.

I think Git suffers – on the whole – from its immense flexibility. You need to settle on a standard workflow, and use a sensible tool set – then everything can be kept under control. For sure, neither SVN nor CVS are credible alternatives nowadays. And the rest of the alternatives mainly suffer from a less-than-complete ecosystem, like Mercurial or even Monotone or Bazaar or GNU Arch or Fossil or Darcs. Yes, we evaluated all of them. I think that apart from Mercurial, nothing really got traction in the real world, and GitHub and GitLab play no small role here. It’s all about the ecosystem – even Microsoft moved TFS onto Git.

Rest assured that the commercial stuff is also really, really bad: Perforce, VSS, PVCS, Team Concert, Endevor, ClearCase. And having said that using the Git command line is torture, I actually seldom use it. Eclipse Git client all the way. Works well enough.

I think that “version control system” is just another complete IT failure, similar to “issue tracking system”. So many to choose from, but none in sight that feels “nice”. |