I have multiple ORM models like this one for a FastAPI endpoint:
```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

from sqlalchemy import CHAR, Column, TIMESTAMP, VARCHAR, text
from sqlalchemy.dialects.oracle import NUMBER
from sqlalchemy.orm import registry

mapper_registry = registry()


@mapper_registry.mapped
@dataclass
class Department:
    __tablename__ = "department"
    __sa_dataclass_metadata_key__ = "sa"

    id: float = field(
        init=False, metadata={"sa": Column(NUMBER(15, 0, False), primary_key=True)})
    label: str = field(metadata={"sa": Column(VARCHAR(255), nullable=False)})
    active: str = field(
        metadata={"sa": Column(CHAR(1), nullable=False, server_default=text("'Y' "))})
    # non-default fields must come before fields with defaults in a dataclass
    date_created: datetime = field(metadata={"sa": Column(TIMESTAMP(True), nullable=False)})
    date_modified: datetime = field(metadata={"sa": Column(TIMESTAMP(True), nullable=False)})
    sort_order: Optional[float] = field(
        default=None, metadata={"sa": Column(NUMBER(4, 0, False))})
    entry_key: Optional[str] = field(
        default=None, metadata={"sa": Column(VARCHAR(255))})
    external_key: Optional[str] = field(
        default=None, metadata={"sa": Column(VARCHAR(255), server_default=text("null"))})
```
And here is the equivalent Pydantic model:
```python
from datetime import datetime

from pydantic import BaseModel


class DepartmentResponse(BaseModel):
    id: float
    label: str
    active: str
    date_created: datetime
    date_modified: datetime
```
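The endpoint itself is not the problem, but for context the wiring is roughly like this (a simplified sketch: it assumes `engine` is an existing Engine for the Oracle database and that `DepartmentResponse` can read from ORM objects, e.g. `orm_mode = True` in Pydantic v1):

```python
from fastapi import FastAPI
from sqlalchemy.orm import Session

app = FastAPI()


@app.get("/departments/{department_id}", response_model=DepartmentResponse)
def read_department(department_id: int):
    # Loading the entity selects date_created and date_modified, the two
    # TIMESTAMP WITH TIME ZONE columns, which is where the error below surfaces.
    with Session(engine) as session:
        return session.get(Department, department_id)
```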
I am using python-oracledb. The original issue is that fetching the TIMESTAMP WITH TIME ZONE columns (date_modified and date_created in this case) raises ORA-01805: possible error in date/time operation (https://github.com/oracle/python-oracledb/issues/20). The error has to do with the internal implementation of Oracle's TIMESTAMP WITH TIME ZONE type.

Basically, I am trying to explore the workaround of casting date_created and date_modified to strings before fetching them.

This is just a simple example. I have tables with nested objects (related tables) and a large number of columns, so I don't want to list the columns one by one just to solve this timestamp-fetching issue, and I want to avoid hand-writing multiple join statements as much as possible (the main reason I am using SQLAlchemy and the ORM in the first place).
Maybe my question should have been: how do I pre-cast these TIMESTAMP WITH TIME ZONE fields before fetching them along with the rest of the fields? Casting the timestamps to strings (func.to_char) during the fetch works for the timestamp fields, as in the sketch below.

Or, more generally, any way to solve the timestamp fetching error.
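Roughly what I mean by the to_char workaround, as a sketch against the Department model above (the format mask is only an example, and `engine` is again an assumed existing Engine):

```python
from sqlalchemy import func, select
from sqlalchemy.orm import Session

# Convert the TIMESTAMP WITH TIME ZONE columns to strings inside the database,
# so the driver never has to decode the problematic native type.
FMT = "YYYY-MM-DD HH24:MI:SS TZH:TZM"

stmt = select(
    Department.id,
    Department.label,
    Department.active,
    func.to_char(Department.date_created, FMT).label("date_created"),
    func.to_char(Department.date_modified, FMT).label("date_modified"),
)

with Session(engine) as session:
    rows = session.execute(stmt).all()
```

This works, but it means spelling the columns out by hand, which is exactly what I want to avoid on the wide, nested tables.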
You need to understand that when you use an ORM, it is for the most part the ORM that writes queries. That is in fact one of the major selling points. To accomplish what you describe, you need to

1. Avoid mapping `COLUMN2` itself to your model class for its table. This will mean, of course, that you cannot set values for that column via the ORM, but that should be fine if you're relying on triggers in the DB to maintain the columns in question.
2. Supposing that you need that column's data at all, instruct your ORM to map the expression `CAST(COLUMN2 as INT)` to an attribute of your model class instances.

Since you're using SQLAlchemy's declarative mapping style, you should get only those mappings you declare, so (1) should be easy.
SQLAlchemy's docs, though voluminous, are a little opaque on how you would accomplish (2), but probably it would be by using a `column_property()` in place of a `Column()` / `mapped_column()`. You may even be able to specify the SQL expression as a plain Python string; there is a sketch of that mapping at the end of this answer.

HOWEVER, I'm not following how any of what you show in the question lines up with ...
To solve this at the Python / SQL interface, you absolutely need to individually map (at least) every column you want to interact with on the Python side, and the mapping example you show seems to indicate that you already recognize and are trying to do that.
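A minimal, untested sketch of that mapping, written against a plain declarative class rather than the question's dataclass mapping, and using `TO_CHAR()` on the question's timestamp columns instead of the generic `CAST(COLUMN2 as INT)` (the format mask is only an example):

```python
from sqlalchemy import CHAR, Column, String, VARCHAR, literal_column
from sqlalchemy.dialects.oracle import NUMBER
from sqlalchemy.orm import column_property, declarative_base

Base = declarative_base()


class Department(Base):
    __tablename__ = "department"

    id = Column(NUMBER(15, 0, False), primary_key=True)
    label = Column(VARCHAR(255), nullable=False)
    active = Column(CHAR(1), nullable=False)

    # (1) The raw TIMESTAMP WITH TIME ZONE columns are deliberately not mapped,
    #     so the ORM never selects them in their native form.
    # (2) Map string-casting expressions in their place.  literal_column() passes
    #     the SQL through verbatim, so these resolve cleanly only in queries where
    #     the column names are unambiguous (e.g. single-table selects).
    date_created = column_property(
        literal_column("TO_CHAR(date_created, 'YYYY-MM-DD HH24:MI:SS TZH:TZM')", String)
    )
    date_modified = column_property(
        literal_column("TO_CHAR(date_modified, 'YYYY-MM-DD HH24:MI:SS TZH:TZM')", String)
    )
```

If that holds up, `select(Department)` should then fetch the `TO_CHAR` expressions rather than the raw columns, without your having to list columns one by one at query time.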