I just joined a new team (very small: four developers in total, with two of those leaving soon). The two original developers set up the git repo in a folder on a Windows network share.
Am I taking crazy pills, or is that a bad idea? Our organization does have GitHub/GitLab/Bitbucket available, so is there any good reason not to use those hosted solutions?
Are you concerned about corruption due to multiple users? As long as you’re using the repo in the intended way, it’s fine. Git has locking mechanisms. Pull, work, commit, push.
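For what it’s worth, the day-to-day workflow looks exactly the same as with any other remote; something like this, with a made-up UNC path:

```
# clone straight from the share (hypothetical path)
git clone //fileserver/dev/project.git
cd project

# then the usual cycle
git pull
# ...edit files...
git add -A
git commit -m "Fix the thing"
git push
```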
I can’t exactly put my finger on it, but something feels off. For example, on my first day, I wasn’t able to view the files in Windows Explorer (or clone the repo, actually), so the other dev just gave me a zip file of the repo. There’s something fishy going on, and I’m trying to figure it out.
Since it’s on a network share, there’s the extra overhead of managing the file system permissions. You probably just hadn’t been granted access at that point.
That’s probably the case, but in my mind I’m also questioning whether they’re backing it up regularly, what prevents someone from going in and deleting the files, and so on.
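In the meantime I might just keep a mirror clone somewhere else as a cheap safety net; something like this, with a made-up path:

```
# one-time: full mirror of the share-hosted repo
git clone --mirror //fileserver/dev/project.git ~/backups/project.git

# on a schedule (cron / Task Scheduler): refresh every ref
cd ~/backups/project.git && git remote update --prune
```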
To be honest, I’d start by asking them, as diplomatically as possible, why it’s set up like that. It might be a bad solution, but whatever pushed them to adopt it anyway might be an organizational peculiarity you don’t want to find out about the hard way.
Do they have an agreement with GitHub/GitLab/Bitbucket? Using their consumer-targeted services as a business is just asking for trouble.
Storing a bare repository on a shared filesystem is a perfectly valid solution, and something git was designed to do, if you don’t mind the lack of a web interface. Although I’d personally prefer using an SSH connection.
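For reference, both setups are only a couple of commands (server names are hypothetical):

```
# bare repo on the Windows share, used directly as a remote
git init --bare //fileserver/dev/project.git
git clone //fileserver/dev/project.git

# the ssh variant: same bare repo, but on a box you can ssh into
ssh devbox 'git init --bare /srv/git/project.git'
git clone devbox:/srv/git/project.git
```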
This is actually a very large government agency, with many internal and external projects hosted on those services, both on the public instances and on our own internally hosted ones. But as long as there are no glaring issues with it, and it’s a generally accepted practice, then I’m fine with it, since it doesn’t really affect my day-to-day use via the command line.
You can run a private instance of GitLab without issues. There are also more open-source and free git repository services beyond GitHub and GitLab that you can install and run locally on a small server.
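Gitea, for instance, is a lightweight one; a minimal sketch of running it under Docker (the ports and volume path are just an assumption, check the docs for a real deployment):

```
# Gitea: self-hosted git service with a web UI
#   3000 -> web interface, 2222 -> ssh access for git,
#   /srv/gitea -> persistent data on the host
docker run -d --name=gitea \
  -p 3000:3000 \
  -p 2222:22 \
  -v /srv/gitea:/data \
  gitea/gitea:latest
```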
Still a better version control than 20210131_v0.2_test2_john
Do you mean a bare repo you use as a remote and push/pull from, or using the workdir straight from the share? The former would be strange but kinda usable (don’t do it, though); the latter is oh-my-god-get-another-job territory.
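If you’re not sure which one you’ve got, git will tell you from inside the repo:

```
# prints "true" for a bare repo (remote-style), "false" for a working tree
git rev-parse --is-bare-repository
```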
Yes, it’s definitely the former case, thankfully. Agreed that it’s strange, but it’s hard to come up with a solid technical argument against it if I decide to push for hosting it somewhere better.
Working from the network share: I’ve worked on a project like this before, and it was awful for developer experience. It took seconds just to run a git status, it was so slow. Occasionally git would lock itself out of being able to update files, since the network read/write times were so slow. Large merges were impossible.
The reason it was set up like this was that the CEO had the network share we were working off configured to serve each developer’s directory at a subdomain (so Bob’s work would be live at bob.example.com), updating in real time the way most hot-reload setups do. He wanted to be able to spy on developers and see what they were doing at any given time.
I have a lot of programming horror stories from that job. The title of this thread brought up a lot of repressed memories.