Discovering limitations of Postgres...
This is probably going to come off more as a rant than anything, apologies now...
On and off for the last 5 years I've had a personal project (eventually I'll go into detail) that I pick up for a bit whenever I decide it's time to give it another shot at completion.
The first rendition was done with a traditional LAMP stack, with CodeIgniter, Twig and a few other things. It took me about 4 months to get the project to 70%, putting in a couple hours here and there. The vast majority of that time was spent learning the stack well enough to do something functional with it. Unfortunately, I didn't have the bandwidth to bring it to completion, as I'm allergic to JavaScript and doing what I wanted was outside my skill level at the time.
Fast-forward to 2019. I've started a new job, I'm living on my own, and I find just enough free time to try again, but differently. This time I chose the Python framework Falcon with a Postgres database, mostly based on performance benchmarks I'd seen (all benchmarks are lies, don't listen to them), while finally embracing that Docker lifestyle, and made it probably 40% of the way to completion. I had inserting and fetching via API endpoints working, but little else. This time I decided to focus only on backend work instead of also trying to make a partially working frontend. This time it was shelved due to personal things keeping me from wanting to explore more of what I was writing.
That brings us to the current day. I've been spending a lot of my work time dealing with APIs and their design, and decided I should practice what I preach and start over. Starting over was mostly because trying to pick up the Falcon framework again after working in Bottle was a pain, and Bottle's performance had mostly caught up to Falcon's, to the point that it's not a big difference. I kept the same backend library I had been writing, so everything was mostly plug and play, and I got back to the same 40% in less than a day. And that's when the fire nation attacked........

Just kidding. I spent whatever free time and energy I had improving the backend code, reaching what I would consider feature parity with other similar projects I know of. Part of this work included a type of expiration table, to be driven by a stored procedure that routinely deletes data based on values in the main table. This was always in the design of the project and is probably crucial to the end product. I hadn't done the stored procedure as part of the LAMP stack because I was focused on getting the frontend where I wanted it instead of that functionality, but I was familiar with how to do what I wanted, so it was a non-issue.
This is when I started to run into my issues. Stored procedure to delete items? Easy peasy. Option to schedule a stored procedure? 404...
Wait, what? Yes, by default, Postgres doesn't include any way to schedule a stored procedure internally. Missing this when I picked Postgres is on me, but in my defense, I never dreamed I'd have to put "can schedule stored procedures" on my checklist when looking at RDBMS options outside of MySQL. I had worked with MSSQL and assumed anything in this day and age would allow it (besides SQLite, but that's expected by design, since it's not a client-server database).
I spent the time doing the research for my options if I wanted to stick with Postgres:
- try to bake a cron to run inside the postgres container
- create a function inside my backend code to run on a cron schedule in a separate thread and trigger the stored procedure
- ditch the concept of scheduling and make the application delete the rows on fetch if the criteria are met (slow and expensive)
- install the pg_cron extension as part of the Postgres image and bake the schedule into the initial SQL that runs when it's spun up with docker-compose
I decided the most portable option was number 3: if I chose to switch RDBMS platforms, it would require the least amount of work to migrate. Honestly, it went great until I tried to activate the schedule.
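For what it's worth, option 3 doesn't need much machinery: a background thread that sleeps for an interval, wakes up, and calls the procedure. A minimal sketch of the idea (the DSN and the procedure name `purge_expired_rows` are placeholders, not from my actual code):

```python
import threading


def run_periodically(task, interval_seconds, stop_event):
    """Call task() every interval_seconds until stop_event is set."""
    # Event.wait() returns False on timeout and True once the event is
    # set, so this loop fires the task after each interval and exits
    # cleanly when asked to stop.
    while not stop_event.wait(interval_seconds):
        task()


def start_scheduler(task, interval_seconds):
    """Run task on a daemon thread; return an Event that stops it."""
    stop_event = threading.Event()
    threading.Thread(
        target=run_periodically,
        args=(task, interval_seconds, stop_event),
        daemon=True,
    ).start()
    return stop_event


# In the real app, the task would open a connection and invoke the
# stored procedure, e.g. with psycopg2 (all names are placeholders):
#
# def purge_expired_rows():
#     with psycopg2.connect(DSN) as conn:
#         with conn.cursor() as cur:
#             cur.execute("CALL purge_expired_rows()")
```

A real cron expression parser (e.g. a library like croniter) would replace the fixed interval in practice, but the shape stays the same.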
```
2021-07-25 22:29:49.825 UTC [84] ERROR: can only create extension in database postgres
2021-07-25 22:29:49.825 UTC [84] DETAIL: Jobs must be scheduled from the database configured in cron.database_name, since the pg_cron background worker reads job descriptions from this database.
2021-07-25 22:29:49.825 UTC [84] HINT: Add cron.database_name = 'database_name' in postgresql.conf to use the current database.
2021-07-25 22:29:49.825 UTC [84] CONTEXT: PL/pgSQL function inline_code_block line 4 at RAISE
2021-07-25 22:29:49.825 UTC [84] STATEMENT: CREATE EXTENSION pg_cron;
psql:/docker-entrypoint-initdb.d/init.sql:3: ERROR: can only create extension in database postgres
DETAIL: Jobs must be scheduled from the database configured in cron.database_name, since the pg_cron background worker reads job descriptions from this database.
HINT: Add cron.database_name = 'database_name' in postgresql.conf to use the current database.
```

Well ..... Shit.
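For completeness, the fix that HINT is pointing at boils down to two lines in postgresql.conf: pg_cron has to be preloaded at server start, and it has to be told which database holds its job table (the database name here is just the placeholder from my setup):

```
# postgresql.conf additions for pg_cron
shared_preload_libraries = 'pg_cron'
# have pg_cron read its job table from the app database instead of 'postgres'
cron.database_name = 'database_name'
```

Since preload settings only take effect at server start, in a container that means baking a custom config into the image or mounting one in, on top of the init SQL.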
So for most installs of Postgres, that doesn't seem to be that big of a deal. Just use the default database as your database in a container, and off to the races, right? Well, it has long been considered bad practice to build your application in the default database, hence my defaulting to using database_name. What about having your initial script create database_name, switch to it, and then set up the tables and whatnot?

Postgres doesn't have or understand the USE command. Everything I found says to connect with psql, run \c database_name, and then run the script, or to embed the switch and the script you want to run inside another script.
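To be fair, the embedded-switch version isn't much code; a script dropped into docker-entrypoint-initdb.d can do the \c inline, since \c is a psql meta-command rather than SQL (database, table, and column names here are placeholders, not my actual schema):

```
-- init.sql (placeholder names)
CREATE DATABASE database_name;

-- \c is interpreted by psql, not the server; everything after it
-- runs against database_name instead of the default database
\c database_name

CREATE TABLE example (
    id         serial PRIMARY KEY,
    expires_at timestamptz NOT NULL
);
```

The catch is that this only works when the script is fed through psql, which is exactly the kind of tool-specific hoop I was trying to avoid.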
No thank you... Not after spending 2 hours getting a working stored procedure and then going down the rabbit hole to find this out.
I'll say it again: these limitations are things I should have discovered much earlier in the process, and it's my fault for not doing the footwork until most of my design was set in stone.
For this project I'll be giving up Postgres, but only because I don't want to jump through these hoops to keep using it for this project. I'm sure I'll have other things I'll work on where it makes perfect sense. So, for now, goodbye Postgres ... it was kinda fun working with you.