YKdvd
2013-06-29 13:28:34 UTC
I have a MySQL system where there is a "global" database, and a number of
identical project databases which also reference global. I connect a
SQLAlchemy engine to, say, "project1", and use a __table_args__ of
{"schema": "global"} for tables that live in global. This works fine, but
as the number of projects increases it is becoming a pain to update them
with schema changes, and I'm trying to incorporate Alembic to help out. My
first step has been to run an autogenerate to see where my SQLA definitions
don't yet fully match the original structures I inherited.
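To make the layout concrete, the setup described above looks roughly like this (a sketch; the model and column names are made up for illustration):

```python
# Minimal sketch of the two-database layout: one table pinned to the
# shared "global" database via __table_args__, one living in whichever
# project database the engine is connected to.
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class GlobalSetting(Base):
    """Lives in the shared "global" database (illustrative name)."""
    __tablename__ = "settings"
    __table_args__ = {"schema": "global"}

    id = Column(Integer, primary_key=True)
    name = Column(String(50))

class Measurement(Base):
    """Lives in the per-project database the engine points at."""
    __tablename__ = "measurements"

    id = Column(Integer, primary_key=True)
    # Cross-database reference: on MySQL, "schema" is really another database.
    setting_id = Column(Integer, ForeignKey("global.settings.id"))
```

With an engine connected to "project1", the unqualified tables resolve there while the qualified ones always hit global.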
Unfortunately, it looks like I can't do quite what I want with stock
alembic. My system's declarative base includes both global and project
definitions; if I set target_metadata in Alembic's env.py to it, Alembic
only inspects the tables in the connected "project" database and thinks
all the global ones have to be created. I found the "include_schemas"
configuration option, but this tells alembic to use inspection to find all
available schemas, which would pull _all_ project databases and any other
visible databases.
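For reference, the env.py wiring described above looks something like this, based on the stock Alembic online-migration template ("myapp.models" is a made-up import path for the declarative base):

```python
# env.py sketch following the stock Alembic template; only the
# target_metadata and include_schemas lines differ from the default.
from sqlalchemy import engine_from_config, pool
from alembic import context

from myapp.models import Base  # hypothetical module holding the declarative base

config = context.config
target_metadata = Base.metadata

def run_migrations_online():
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            # True tells autogenerate to inspect *every* schema visible
            # to the connection -- including all the other project
            # databases, which is exactly the problem described above.
            include_schemas=True,
        )
        with context.begin_transaction():
            context.run_migrations()
```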
I was thinking of making a patch to autogenerate.py, in
_produce_net_changes(). Right now, if the configuration option
"include_schemas" is set True, it does the inspection to get all schemas.
I was thinking of doing something like include_schemas = set([None,
'global']) in env.py, and having the code bypass the inspection and use
this value when it is a set instead of a boolean. This seemed to work on a
very quick and dirty test - is there anything that I should watch out for,
or that makes this a bad idea in practice? In theory I assume I could
somehow extract the set of schemas from the metadata.
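The proposed dispatch can be sketched as a standalone function (illustrative only, not actual Alembic source; the names here are made up):

```python
# Sketch of the proposed semantics: decide which schemas autogenerate
# should compare when include_schemas may be a bool or an explicit set.
def schemas_to_inspect(include_schemas, inspected_schema_names, default_schema):
    if isinstance(include_schemas, set):
        # Proposed extension: an explicit set bypasses inspection entirely.
        return include_schemas
    if include_schemas:
        # Stock behaviour: every schema the connection can see, with the
        # default (connected) schema represented as None.
        return {None if s == default_schema else s
                for s in inspected_schema_names}
    return {None}

# With the proposed set, only the connected DB and "global" are compared:
schemas_to_inspect({None, "global"},
                   ["project1", "project2", "global"],
                   "project1")
# -> {None, "global"}
```

Extracting the set from the metadata should also be possible with something like `{t.schema for t in target_metadata.tables.values()}`, which for the layout above would yield {None, "global"} automatically.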
--
You received this message because you are subscribed to the Google Groups "sqlalchemy-alembic" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sqlalchemy-alembic+***@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.