Timeseries modeling from a pandas dataframe #639
Example Code

```python
import pandas as pd
from sqlmodel import Field, Session, SQLModel, create_engine
from datetime import datetime


class StockTimeSeries(SQLModel, table=True):
    id: int = Field(primary_key=True)
    symbol: str
    timestamp: datetime
    price: float


sqlite_file_name = "stock_database.db"
sqlite_url = f"sqlite:///{sqlite_file_name}"
engine = create_engine(sqlite_url, echo=True)


def create_db_and_tables():
    SQLModel.metadata.create_all(engine)


def insert_dataframe_to_db(df):
    with Session(engine) as session:
        for _, row in df.iterrows():
            time_series_entry = StockTimeSeries(
                id=row["id"],
                symbol=row["symbol"],
                timestamp=row["timestamp"],
                price=row["price"],
            )
            session.add(time_series_entry)
        session.commit()


def main():
    create_db_and_tables()
    # Simulating a Pandas DataFrame for demonstration
    data = {
        "id": [1, 2, 3],
        "symbol": ["AAPL", "AAPL", "AAPL"],
        "timestamp": [
            datetime(2023, 8, 1, 9, 0),
            datetime(2023, 8, 1, 10, 0),
            datetime(2023, 8, 1, 11, 0),
        ],
        "price": [150.0, 155.0, 160.0],
    }
    df = pd.DataFrame(data)
    insert_dataframe_to_db(df)


if __name__ == "__main__":
    main()
```

Description

Hi, I just started using SQLModel, and I would like to know whether this way of managing time series is optimal. Is this correct, or is there another way to meet this requirement? Thanks.

Operating System: Windows
Operating System Details: No response
SQLModel Version: 0.0.8
Python Version: 3.9
Additional Context: No response
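As a side note (not part of the accepted answer below), pandas itself can write a DataFrame into a SQL table in one call with `DataFrame.to_sql`, bypassing the ORM entirely. The sketch below assumes the table name `stocktimeseries` (SQLModel's default lowercased class name) and uses an in-memory SQLite database in place of the file-based one from the question:

```python
import sqlite3

import pandas as pd

# Build the same demo frame as in the question.
df = pd.DataFrame({
    "id": [1, 2, 3],
    "symbol": ["AAPL", "AAPL", "AAPL"],
    "timestamp": pd.to_datetime([
        "2023-08-01 09:00", "2023-08-01 10:00", "2023-08-01 11:00",
    ]),
    "price": [150.0, 155.0, 160.0],
})

# Write the whole frame into the table in one call.
# if_exists="append" adds rows to an existing table; index=False
# keeps the DataFrame index out of the stored columns.
conn = sqlite3.connect(":memory:")
df.to_sql("stocktimeseries", conn, if_exists="append", index=False)

count = conn.execute("SELECT COUNT(*) FROM stocktimeseries").fetchone()[0]
print(count)  # 3
```

Note that `to_sql` does not run model validation, so it trades SQLModel's type checking for speed and brevity.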
Answered by YuriiMotov on Aug 19, 2025:
Your approach looks good. For big data volumes, you can use a bulk insert:

```python
from sqlalchemy import insert

data = [
    {
        "id": row["id"],
        "symbol": row["symbol"],
        "timestamp": row["timestamp"],
        "price": row["price"],
    }
    for _, row in df.iterrows()
]
session.execute(insert(StockTimeSeries), data)
session.commit()
```
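The list of dicts above can also be produced without `iterrows()` by calling `DataFrame.to_dict(orient="records")`, which converts each row to a dict in one vectorized call and is typically faster for large frames. A minimal sketch:

```python
from datetime import datetime

import pandas as pd

df = pd.DataFrame({
    "id": [1, 2],
    "symbol": ["AAPL", "AAPL"],
    "timestamp": [datetime(2023, 8, 1, 9), datetime(2023, 8, 1, 10)],
    "price": [150.0, 155.0],
})

# One call replaces the iterrows() comprehension: each row becomes
# a {"id": ..., "symbol": ..., "timestamp": ..., "price": ...} dict.
data = df.to_dict(orient="records")
print(data[0]["symbol"])  # AAPL
```

The resulting list can be passed straight to the bulk insert shown above.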