I totally agree on investing in a sane data model upfront. So many production systems have schemas that only made sense to the engineer who created them. I would be delighted if I could read a schema and understand what a column means without having to dig through a bunch of migration PRs.
I recently encountered `is_as BOOL` in an important table. After far too much time invested, we found out it meant "is active service". </DDL rant>
I think the best db schema I had the displeasure of working with was one where every table and column name was required to NOT contain vowels, except for the few that could, and "the few that could" were governed entirely by a spreadsheet owned by the DB admin.
And so you got tables like LANDMRK and columns like RCR_RCRDR.
> I recently encountered `is_as BOOL` in an important table.
Sounds like a table designed by Forrest Gump.
Postgres has COMMENT ON to help with this, but descriptive names are still preferable.
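For instance, a minimal sketch of how the `is_as` column from upthread could at least be documented in place (the table name and comment text are invented for illustration):

```sql
-- Comments live in the pg_description catalog and show up in
-- psql via \d+, so the explanation travels with the schema.
COMMENT ON TABLE service IS 'One row per subscribed service.';
COMMENT ON COLUMN service.is_as IS 'is active service (legacy name)';
```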
I integrate with many ERPs and this is the bane of my existence.
One of the worst has field names like `ft_0001...N` and table names like `UNCC_00001...N`, everything stored in `text` columns (even numbers!), zero foreign keys, almost no indexes, and what are views?
The other has a funny field that is a blob needing decoding with a specific FreePascal version. The field? It's where the price of the product is stored.
Another mixes "," and "." as decimal separators within the same column, and I have to check the number of decimal places to deduce which one is the decimal separator.
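That deduction can be sketched roughly like this. A minimal heuristic, assuming prices have at most two decimal digits; the function name and the 1-to-2-digit rule are my assumptions, not the ERP's spec:

```python
def parse_ambiguous_decimal(raw: str) -> float:
    """Guess whether ',' or '.' is the decimal separator.

    Heuristic: the last ',' or '.' is the decimal separator only if
    1-2 digits follow it; otherwise treat all separators as
    thousands grouping.
    """
    sep_pos = max(raw.rfind(","), raw.rfind("."))
    if sep_pos == -1:
        return float(raw)  # no separator at all
    digits_after = len(raw) - sep_pos - 1
    if 1 <= digits_after <= 2:
        # Last separator is decimal: strip the rest, normalise to '.'.
        integer_part = raw[:sep_pos].replace(",", "").replace(".", "")
        return float(f"{integer_part}.{raw[sep_pos + 1:]}")
    # Otherwise everything is thousands grouping.
    return float(raw.replace(",", "").replace(".", ""))
```

So `"1.234,56"` and `"1,234.56"` both come out as `1234.56`, while `"1.234"` is read as one thousand two hundred thirty-four. It is still a guess: a genuine `1.234` with three decimal places would be misread, which is exactly why this kind of column is so painful.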
FUN.
P.S.: I normalize all these ERPs into my own schema, and I've gotten praise for things like my product table being called products.