Imagine a world where your toaster was also a chair. A car that was also a blender. A bike that was also a food processor.
These examples, silly as they may be, highlight an essential principle of design: that which does too much does not "feel" designed. Conversely, simplicity feels intentionally, exquisitely produced.
Limiting the purpose of a thing to a narrow scope often has benefits. Perceived drawbacks can usually be remedied by composing it with another simple design.
An example of simplicity composed can be seen by comparing a barber's clippers and a barber's trimmers. There are two kinds of tool, each limited to its respective purpose: clippers for the "bulk work", trimmers for the "fine work". Each excels at its intended purpose, and together they compose into a set of tools that can realize the full range of cuts anyone who wants a haircut might desire. Imagine a single tool that attempted to manifest both purposes. My imagination projects something that is passable at best, but fails to be exceptional, or even good, at either.
The same principle, in my mind at least, applies to data systems. The liberal data systems we have today fail to capture any sort of simplicity precisely because the bounds on their purpose are theoretically limitless. The broader world is always accessible by just one more feature or plugin.
Take a look at the Postgres feature matrix. Even a cursory glance signals that Postgres is a complicated Swiss Army knife of a data system. Couple that with an interface that is a variant of SQL, and the complication balloons rather than shrinks. Postgres, in my opinion, is technically an "unlimited" data system. Such limitlessness is its Achilles' heel, not its triumph.
Postgres can do so much, but at the cost of not being exceptionally good at any one thing. To give credit where it is due, a great deal has been built on it. But Postgres' merits as a production data system do not justify the conclusion most people draw from them: that it is a sufficient data store on which to model a domain. I posit it is not, unless much care is taken, especially once concurrency controls are factored into the equation.
The relational algebra (and its manifestation in SQL data stores), while clever and elegant with its normal forms and mathematically grounded formulae, has in practice become a sub-par technology most people and companies are happy to limp along with, provided there are wrapper layers and tooling to push the complexity somewhere no one immediately needs to pay attention to it.
And herein lies the problem: how can a data system be limited to an essential purpose, yet still meet the everyday needs of businesses and engineers/developers?
How can something be so simple and ubiquitous it does not need an "ecosystem" around it to be useful in the large and the small?
An example of a limited data system is FoundationDB, a key/value store. Its complexity is mostly pushed toward administration (i.e. bootstrapping a cluster), but the data model is simple: bytes as keys, bytes as values. The entire C API can be digested in less than an hour. FoundationDB's limitations let its designers pour their concern into building a correct and reliable data store (the project's deterministic simulation testing is superb at ensuring correctness in the face of failure). Apple built iCloud on top of FoundationDB. Griffin, a "Bank as a Service", was able to build a Datalog store on top of FoundationDB. The common thread is that FoundationDB was limited, and therefore able to be intentionally designed. Its design can be composed with other ideas, much like our barber's clippers and trimmers.
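To make the "digested in less than an hour" claim concrete, here is a minimal sketch against the FoundationDB C API that writes one key/value pair and reads it back. It assumes API version 710 and a reachable cluster via the default cluster file, and it omits error checking for brevity; treat it as an illustration of the data model, not production code.

```c
#define FDB_API_VERSION 710
#include <foundationdb/fdb_c.h>
#include <pthread.h>
#include <stdio.h>

/* The client's network loop must run on its own thread
   for the lifetime of the process. */
static void *run_network(void *arg) {
    (void)arg;
    fdb_run_network();
    return NULL;
}

int main(void) {
    fdb_select_api_version(FDB_API_VERSION);
    fdb_setup_network();
    pthread_t net;
    pthread_create(&net, NULL, run_network, NULL);

    FDBDatabase *db;
    fdb_create_database(NULL, &db); /* NULL = default cluster file */

    /* Write: bytes as keys, bytes as values. That's the whole model. */
    FDBTransaction *tr;
    fdb_database_create_transaction(db, &tr);
    fdb_transaction_set(tr, (const uint8_t *)"hello", 5,
                            (const uint8_t *)"world", 5);
    FDBFuture *commit = fdb_transaction_commit(tr);
    fdb_future_block_until_ready(commit);
    fdb_future_destroy(commit);
    fdb_transaction_destroy(tr);

    /* Read it back; every read hands you a future to wait on. */
    fdb_database_create_transaction(db, &tr);
    FDBFuture *f = fdb_transaction_get(tr, (const uint8_t *)"hello", 5, 0);
    fdb_future_block_until_ready(f);
    fdb_bool_t present;
    const uint8_t *val;
    int len;
    fdb_future_get_value(f, &present, &val, &len);
    if (present)
        printf("hello -> %.*s\n", len, val);
    fdb_future_destroy(f);
    fdb_transaction_destroy(tr);

    fdb_database_destroy(db);
    fdb_stop_network();
    pthread_join(net, NULL);
    return 0;
}
```

Everything else in the API (range reads, atomic operations, watches) is a variation on the same future-based pattern, which is why it fits in an afternoon of reading. Compile with something like `cc demo.c -lfdb_c -lpthread`.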
We need to strive to design by specifying limitations. Anything short of that is a path I do not want to tread.