Loose coupling is an old electrical engineering concept applied to computer science, used mostly when talking about object-oriented architectures. To wit: objects are loosely coupled when each has only a weak dependency on -- little to no knowledge of -- the other components in a system. It's not encapsulation vs. non-encapsulation -- knowledge of the inside of a class vs. its public methods. It's knowledge about the class itself.
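A minimal sketch of the distinction, with hypothetical class names: the tightly coupled report constructs its own concrete store, so it knows the class itself; the loosely coupled one only knows that *something* with a `fetch()` method will be handed in.

```python
# Hypothetical sketch of tight vs. loose coupling between two components.

class MySQLStore:
    def fetch(self):
        return ["row1", "row2"]

class TightReport:
    def __init__(self):
        self.store = MySQLStore()  # hard dependency on the concrete class

    def render(self):
        return ", ".join(self.store.fetch())

class LooseReport:
    def __init__(self, store):
        self.store = store         # any object with fetch() will do

    def render(self):
        return ", ".join(self.store.fetch())

class StubStore:
    # swap in anything -- a cache, a test stub, a REST client
    def fetch(self):
        return ["stub"]

print(LooseReport(StubStore()).render())  # -> stub
```

`MySQLStore` can change how it works (or be replaced entirely) and `LooseReport` keeps running, which is the whole point.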
In big, highly threaded OO systems, loose coupling is a holy grail. The less interdependent the objects within an overall system are, the more the system can expand and plan for change. (This is the core mantra of architecture -- plan for change.) If one set of classes over here, say, can change the way it works, and a set of classes over there can still use the data produced without any knowledge of the changes, the system has long-term stability.
But to hell with it. Table flip. Giant monolithic OO systems are dead. Functional programming, which was dead once, is alive again, like the mathematically-based computer science zombie it is.
Loosely couple all the things!
We're no longer loosely coupling objects while adhering to good SOLID principles. We're loosely coupling entire systems. Fun things to divorce from other parts of a system:
* Whole chunks of software architecture! Today we call it microservices, but it's just loosely coupled objects taken to an extreme. Fragments of applications get broken out into small single-serving REST services, coordinated through queuing (services talk to workers who do things to databases) or caching (services talk to complex caching stores to send data to each other). Why call methods on classes when one can fire and forget a message and expect ZooKeeper or AMQP to get it there? We can do it with events, baby.
* Configuration! Configuration management systems (Chef, Puppet, Ansible, SaltStack, your favorite shell scripts) have configuration over... here... while the systems exist over... there. Systems are described by their configuration but no longer defined by their configuration. And this is how God intended when He created configuration files. Now even that loose coupling is managed.
* Build artifacts! A build system -- ha ha! -- kicks out things. Otherwise it's not a very good build system, because it isn't building anything. Build systems have gotten smart and cunning, and can now separate the things they build from the systems they stand up. Gone are the days of Capistrano scripts dipping into git to roll builds (unless you still do that, and you should not do that, because it is bad on many levels, not least security); now we can smoothly decouple the code from the artifacts the build system rolls and from the configuration system.
* Server builds! Vagrant now loosely couples the build artifacts and the configuration from the actual server build. Server builds are now fancy Vagrantfiles held together by some high-level scripting. Wrap it in Docker (oooh, Docker) and we have servers, systems, configuration, code, and virtualization all loosely coupled in little container bags.
* Entire servers! We're rapidly moving away from a world of huge datacenters and static servers that hang around for years and get gunky. Highly ephemeral servers are themselves loosely coupled from the code, from the system, and, in fact, from each other. And why have huge servers running giant vertical workloads when one can have 1000 cores spread across Hadoop nodes that can be stood up and shut down at a whim? Having the networks, network architecture, servers, and server architecture decoupled from themselves and everything around them forces a sort of fragmented, bite-sized architecture all the way up the stack.
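The fire-and-forget coordination in the microservices bullet above can be sketched in-process, with Python's `queue` module standing in for a real broker like RabbitMQ (AMQP). The key property: the service never calls the worker; it drops a message on the queue and moves on.

```python
# Sketch of fire-and-forget messaging; queue.Queue stands in for an AMQP broker.
import queue
import threading

broker = queue.Queue()   # stand-in for an exchange/queue on a real broker
results = []

def worker():
    while True:
        msg = broker.get()
        if msg is None:  # sentinel: shut down
            break
        results.append(f"processed {msg}")  # "do things to databases"

t = threading.Thread(target=worker)
t.start()

# The "service" fires and forgets; it holds no reference to the worker.
broker.put({"order_id": 42})
broker.put(None)
t.join()
print(results)
```

Swap `broker.put` for a `basic_publish` against a real AMQP broker and the shape is the same: the publisher knows a queue name and a message format, nothing about who consumes it.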
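And the build-artifact bullet, as a toy: the build step rolls a versioned tarball and records its checksum; the deploy step verifies and unpacks the artifact by name, with no knowledge of (or access to) the source repository. Names and versions here are made up.

```python
# Sketch of artifact-based deploys: build rolls a checksummed artifact,
# deploy trusts only the artifact -- never git.
import hashlib
import io
import tarfile

def build(version, files):
    """Roll source files into an in-memory tarball and checksum it."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    blob = buf.getvalue()
    return f"app-{version}.tar.gz", blob, hashlib.sha256(blob).hexdigest()

def deploy(name, blob, expected_sha):
    """Verify and unpack; the deploy side never touches the repo."""
    assert hashlib.sha256(blob).hexdigest() == expected_sha, "corrupt artifact"
    with tarfile.open(fileobj=io.BytesIO(blob), mode="r:gz") as tar:
        return sorted(tar.getnames())

name, blob, sha = build("1.2.3", {"app.py": b"print('hi')"})
print(name, deploy(name, blob, sha))
```

The configuration system can then point servers at `app-1.2.3.tar.gz` by name, which is the loose coupling: code, artifact, and deploy target only meet through a version string and a checksum.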
We wander on, embracing a sort of mad ephemerality where everything is so loosely coupled -- the code from itself, the code from its configuration, the code from its servers, its virtualization, its network hardware -- that everything becomes a hazy slurry of pieces that depend on one another only weakly, through long strings of data. And this drives a certain kind of thinking about systems, where everything should be independent of everything else but still, from a distance, come together as a coherent whole.
Instead of cloud computing it should be called particle computing. Really, it's not a cloud. It's lots of little tiny itty bitty bits held together with clever queuing and caching to make them look like a whole. Kind of like the core principles of matter: atoms held together into complex molecules with, you know, clever uses of Cassandra instead of the strong nuclear force.
Mad rant off.