So, we write classes so that code can be reused. Then we can use, say, class "Foo" in multiple projects.

Now, say I have used "Foo" on a web application, and also in a related web application, and they both live on the same hardware.

Application.php

require_once("classes/foo.php");

SomethingElse.php

require_once($path_to_application_php . "/classes/foo.php");

I think this is, in a way, BETTER than SomethingElse having its own copy of Foo. When I make changes to Foo, the new functionality is available to both apps without me having to do much work. However, what if we decide to move SomethingElse to a different server?

I've thought about remote mounts, but that's a big PITA.

Is it possible with Git (for example) to push class files like this from a central repository to multiple other locations? I think I MIGHT have a strategy ... but are there multiple ways to accomplish this goal? How do you do it, or would you do it ... or would you NOT do it and do something else?

    What I'm wanting to avoid is keeping multiple copies of class Foo and having to edit each of them when we think of an improvement to the class' methods, etc.

      My first thought would be to put a related set of classes together into their own Git repo, and then add that as a Git submodule in any other repo that wants to use them. (We ended up doing this for a couple of projects I work on.) You might want to namespace it, as well? (I've not really used PHP namespacing much, so I'm not sure how beneficial/annoying that would be.)
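      A minimal sketch of the submodule approach. All repository names and paths here are made up for illustration (in real use the submodule URL would point at your hosted library repo); this just builds a local library repo and pulls it into an app repo under lib/:

```shell
set -e
cd "$(mktemp -d)"

# Stand-alone repo holding the shared classes (name is hypothetical).
git init -q foo-classes
( cd foo-classes
  git -c user.email=demo@example.com -c user.name=demo \
      commit -q --allow-empty -m "initial library commit"
)

# The application repo pulls the library in as a submodule under lib/.
git init -q app
( cd app
  # Recent Git versions block the file:// transport for submodules
  # by default, so it must be allowed explicitly for this local demo.
  git -c protocol.file.allow=always \
      submodule add -q ../foo-classes lib/foo-classes
)

# The submodule is recorded in .gitmodules in the app repo.
cat app/.gitmodules

# After cloning the app elsewhere, you'd populate the submodule with:
#   git submodule update --init --recursive
# ...and later pull newer library commits with:
#   git submodule update --remote lib/foo-classes
```

      Each consuming repo pins a specific commit of the library, so updating one app can't silently break another.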

      Another alternative might be to leverage PHP Composer, I suppose?

      NogDog "Git submodule" ...

      Excellent! I'll take a look at this very soon. 🙂

        You might also want to consider that sometimes you have a brilliant new idea for functionality, but it turns out to be backward incompatible. You could change your current project to accommodate the new feature, but you may not have the time to make the corresponding changes to older projects that rely on this module. Yet you (or others using your projects) might have new deployments of those older projects, so they cannot just use the head of the master/release branch of the repository. Therefore, some release versioning is required, which is also typically supported in version control systems through things like release version tagging.
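        For example, release tagging in Git might look like the sketch below (the repo and version numbers are invented): older deployments keep checking out v1.0.0 while new work targets v2.0.0.

```shell
set -e
cd "$(mktemp -d)"
git init -q foo-lib
cd foo-lib

# A stable release of the library (empty commits stand in for real work).
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "stable release"
git tag -a v1.0.0 -m "First stable API"

# A backward-incompatible rework lands afterwards; it gets a new
# major version tag, and old projects simply keep pinning v1.0.0.
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "backward-incompatible rework"
git tag -a v2.0.0 -m "New, incompatible API"

git tag --list
```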

        laserlight Therefore, some release versioning is required, which is also typically supported in version control systems through things like release version tagging.

        Not saying that's a bad thing, but the peril of letting it proliferate unchecked is a dependency hell where every single project ends up needing its own specific versions of every other package, leaving you with no benefit to having shared libraries.

        In the end the only sane way to deploy an application is to have everything from the OS up hand-picked for compatibility, installed, and deployed as a VM.

          Or you follow the concept for library classes of never changing existing methods' functionality; instead only adding new methods when you need different functionality (hopefully named in a way that makes it clear what the difference is).

            Weedpacket wrote:

            In the end the only sane way to deploy an application is to have everything from the OS up hand-picked for compatibility, installed, and deployed as a VM.

            Actually, that's the stance taken by the Docker folks, except that they don't go as far as to deploy as a VM, but as a "container".

            NogDog wrote:

            Or you follow the concept for library classes of never changing existing methods' functionality; instead only adding new methods when you need different functionality (hopefully named in a way that makes it clear what the difference is).

            That is a recipe for bloated core interfaces, such that a small change to the internals of the class could necessitate a huge amount of reworking of existing methods. What you should do instead is have free functions extend the interface; but that only works if the core interface supports what you want to do, and it might not.

              Docker is pretty awesome -- our team at work uses it for everything we work on now. 🙂

                Well, folks, this is pretty much exactly what I was hoping for ... some discussion of the merits and demerits of various approaches to the issue. 🙂 What follows is free-form associational thinking and isn't directed at anyone in particular and isn't meant to be critical.

                "Team" --- we've not got one. At least, not where Programming & DevOps are concerned.

                Dependency Hell ... I'm quite familiar with this as a longstanding FreeBSD user. The amount of work they've gone to in order to try and eliminate this is astounding, but I still end up with it once in a while. And, this is pretty much what I have now. I have a "main" site, another with different branding depending on the particulars of the product/category in question, and a third which is the "mobile" version of the first. And I get a little tired of doing everything thrice ... but I also hate when changes to one project's dependent classes cause breakage in another project.

                A "container" is more or less functionally equivalent to a "jail", which FreeBSD has had for quite a bit longer than Docker has existed. But I've never looked into it.

                As for the "only sane way" ... At this stage of the game I commit changes to production servers pretty near daily. I'm not sure we could ever say a product is "finished" enough to deploy all-at-once in a perfectly pristine environment. I WISH we could.

                As for "bloated core interfaces", I'm not TOO scared of that just yet ... what about the concept of configuration via arguments, a la "Functional JavaScript" or Dave Cheney's Functional Options for Friendly APIs?

                Oh, and I really need a vacation. 😛

                  9 days later

                  I'll toss this in the mix: Composer. This is specifically what it's meant for: dependency management. You can generate your core library of code as a separate repository, "publish" it to a composer repository (public or private) and then in your other sites / projects simply do: composer require my-namespace/my-library and you have the latest version of the library.

                  Yes, deploying the updated code will require you to update every project individually; however, it gets around the whole "breaking functionality" thing. Then you can update each site or project as needed and the others can run their own version of the library without fear of breaking due to a separate update.

                  You can run private composer repositories (see: Satis or Private Packagist). I do it for my current employer and it's working very well. Library and project code is in GitHub, dependencies for PHP functionality via Composer, and each time a component has a new release, a simple composer update brings in all the updates for me to test. It's a great little tool, and one PHP needed for a while.
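                  A sketch of what that setup might look like, assuming a hypothetical package name acme/foo-lib and a placeholder VCS URL. The actual composer commands are left commented, since they need Composer installed and access to the repository:

```shell
set -e
cd "$(mktemp -d)"

# Minimal composer.json for a project consuming the shared library
# from a private VCS repository (both names are placeholders).
cat > composer.json <<'EOF'
{
    "repositories": [
        { "type": "vcs", "url": "git@example.com:acme/foo-lib.git" }
    ],
    "require": {
        "acme/foo-lib": "^1.0"
    }
}
EOF

# With Composer available, this installs the latest 1.x release:
#   composer install
# ...and later, to pull in new releases of just the library:
#   composer update acme/foo-lib
cat composer.json
```

                  The "^1.0" constraint is what prevents the "breaking functionality" problem: each project stays on the major version it was built against until you deliberately bump it.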
