pudl.io_managers

Dagster IO Managers.

Module Contents

Classes

PudlMixedFormatIOManager

Format switching IOManager that supports sqlite and parquet.

SQLiteIOManager

IO Manager that writes and retrieves dataframes from a SQLite database.

PudlParquetIOManager

IOManager that writes pudl tables to pyarrow parquet files.

PudlSQLiteIOManager

IO Manager that writes and retrieves dataframes from a SQLite database.

FercSQLiteIOManager

IO Manager for reading tables from FERC databases.

FercDBFSQLiteIOManager

IO Manager for only reading tables from the FERC 1 database.

FercXBRLSQLiteIOManager

IO Manager for only reading tables from the XBRL database.

EpaCemsIOManager

An IO Manager that dumps outputs to a parquet file.

Functions

get_table_name_from_context(→ str)

Retrieves the table name from the context object.

pudl_mixed_format_io_manager(→ dagster.IOManager)

Create a format-switching IO manager dagster resource for the pudl database.

ferc1_dbf_sqlite_io_manager(→ FercDBFSQLiteIOManager)

Create a SQLiteManager dagster resource for the ferc1 dbf database.

ferc1_xbrl_sqlite_io_manager(→ FercXBRLSQLiteIOManager)

Create a SQLiteManager dagster resource for the ferc1 xbrl database.

epacems_io_manager(→ EpaCemsIOManager)

IO Manager that writes EPA CEMS partitions to individual parquet files.

Attributes

pudl.io_managers.logger[source]
pudl.io_managers.MINIMUM_SQLITE_VERSION = '3.32.0'[source]
pudl.io_managers.get_table_name_from_context(context: dagster.OutputContext) str[source]

Retrieves the table name from the context object.
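A plausible implementation can be sketched in plain Python. This is an assumption about the lookup order, not the actual source: prefer the asset key's final path component when the output is an asset, otherwise fall back to the op's name. The `SimpleNamespace` object stands in for a real `dagster.OutputContext`.

```python
from types import SimpleNamespace


def get_table_name_from_context(context) -> str:
    """Sketch (assumed behavior): resolve the table name from the context."""
    if getattr(context, "has_asset_key", False):
        # Asset outputs: the last component of the asset key is the table name.
        return context.asset_key.path[-1]
    # Non-asset op outputs: fall back to the op name.
    return context.name


# Hypothetical stand-in for dagster.OutputContext, for illustration only.
ctx = SimpleNamespace(
    has_asset_key=True,
    asset_key=SimpleNamespace(path=["core_demo__plants"]),
    name="my_op",
)
table_name = get_table_name_from_context(ctx)
```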

class pudl.io_managers.PudlMixedFormatIOManager(write_to_parquet: bool = False, read_from_parquet: bool = False)[source]

Bases: dagster.IOManager

Format switching IOManager that supports sqlite and parquet.

This IOManager allows experimental output of parquet files alongside the standard sqlite database produced by PUDL. During this experimental phase, sqlite will always be written, while parquet support is off by default.

Parquet support can be enabled either using environment variables or the dagster UI (see pudl_mixed_format_io_manager() for more info on the environment variables). Parquet writing and reading can be toggled independently. If parquet writing is enabled, both parquet and sqlite tables are produced; if parquet reading is enabled, assets are read only from the parquet files.

write_to_parquet: bool[source]

If true, data will be written to parquet files.

read_from_parquet: bool[source]

If true, data will be read from parquet files instead of sqlite.
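The dispatch implied by these two flags can be sketched with plain Python. The in-memory backends below are hypothetical stand-ins for the real SQLite and parquet IO managers, not their actual implementations:

```python
import pandas as pd


class _DictBackend:
    """Hypothetical stand-in for a SQLite or parquet IO manager."""

    def __init__(self):
        self.tables = {}

    def write(self, name, df):
        self.tables[name] = df

    def read(self, name):
        return self.tables[name]


class MixedFormatManager:
    """Sketch of the dispatch logic: sqlite is always written; parquet
    writing and reading are independent toggles."""

    def __init__(self, write_to_parquet=False, read_from_parquet=False):
        self.sqlite = _DictBackend()
        self.parquet = _DictBackend()
        self.write_to_parquet = write_to_parquet
        self.read_from_parquet = read_from_parquet

    def handle_output(self, name, df):
        self.sqlite.write(name, df)       # sqlite output is unconditional
        if self.write_to_parquet:
            self.parquet.write(name, df)  # parquet only when enabled

    def load_input(self, name):
        # Reading is all-or-nothing: parquet if enabled, otherwise sqlite.
        source = self.parquet if self.read_from_parquet else self.sqlite
        return source.read(name)


mgr = MixedFormatManager(write_to_parquet=True, read_from_parquet=True)
df = pd.DataFrame({"x": [1, 2]})
mgr.handle_output("demo_table", df)
```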

handle_output(context: dagster.OutputContext, obj: pandas.DataFrame | str) pandas.DataFrame[source]

Passes the output to the appropriate IO manager instance.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Reads input from the appropriate IO manager instance.

class pudl.io_managers.SQLiteIOManager(base_dir: str, db_name: str, md: sqlalchemy.MetaData | None = None, timeout: float = 1000.0)[source]

Bases: dagster.IOManager

IO Manager that writes and retrieves dataframes from a SQLite database.

_setup_database(timeout: float = 1000.0) sqlalchemy.Engine[source]

Create database and metadata if they don’t exist.

Parameters:

timeout – How many seconds the connection should wait before raising an exception, if the database is locked by another connection. If another connection opens a transaction to modify the database, it will be locked until that transaction is committed.

Returns:

SQLAlchemy engine that connects to a database in the base_dir.

Return type:

engine
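For a self-contained illustration of the timeout semantics, the stdlib sqlite3 module exposes the same busy-wait behavior that SQLAlchemy forwards to the driver:

```python
import sqlite3

# The timeout is handed to SQLite's busy handler: a connection waits up
# to `timeout` seconds for a competing writer's lock to clear before
# raising "database is locked". A generous value suits long ETL steps
# that can hold the write lock for a while.
conn = sqlite3.connect(":memory:", timeout=1000.0)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (42)")
value = conn.execute("SELECT x FROM t").fetchone()[0]
conn.close()
```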

_get_sqlalchemy_table(table_name: str) sqlalchemy.Table[source]

Get a SQLAlchemy Table object from the metadata given a table_name.

Parameters:

table_name – The name of the table to look up.

Returns:

The corresponding SQLAlchemy Table in the SQLiteIOManager metadata.

Return type:

table

Raises:

ValueError – if table_name does not exist in the SQLiteIOManager metadata.

_handle_pandas_output(context: dagster.OutputContext, df: pandas.DataFrame)[source]

Write dataframe to the database.

SQLite does not support concurrent writes to the database. Instead, SQLite queues write transactions and executes them one at a time. This allows the assets to be processed in parallel. See the SQLAlchemy docs to learn more about SQLite concurrency.

Parameters:
  • context – dagster keyword that provides access to output information like asset name.

  • df – dataframe to write to the database.
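A minimal sketch of the pandas-to-SQLite write path, using an in-memory database (the table name here is hypothetical):

```python
import sqlite3

import pandas as pd

# SQLite serializes writers, so parallel assets queue behind one another
# rather than failing (subject to the connection timeout).
conn = sqlite3.connect(":memory:")
df = pd.DataFrame({"plant_id": [1, 2], "capacity_mw": [50.0, 75.5]})
df.to_sql("core_demo__plants", conn, index=False, if_exists="replace")

roundtrip = pd.read_sql("SELECT * FROM core_demo__plants", conn)
conn.close()
```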

_handle_str_output(context: dagster.OutputContext, query: str)[source]

Execute a sql query on the database.

This is used for creating output views in the database.

Parameters:
  • context – dagster keyword that provides access to output information like asset name.

  • query – sql query to execute in the database.
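The string-output path boils down to executing a SQL statement directly, typically to define a view over existing tables. A sketch with stdlib sqlite3 (table and view names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plants (id INTEGER, capacity_mw REAL)")
conn.executemany("INSERT INTO plants VALUES (?, ?)", [(1, 50.0), (2, 75.5)])

# The "output" here is a SQL string, executed as-is to create a view.
conn.execute(
    "CREATE VIEW big_plants AS SELECT * FROM plants WHERE capacity_mw > 60"
)
rows = conn.execute("SELECT id FROM big_plants").fetchall()
conn.close()
```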

handle_output(context: dagster.OutputContext, obj: pandas.DataFrame | str)[source]

Handle an op or asset output.

If the output is a dataframe, write it to the database. If it is a string, execute it as a SQL query.

Parameters:
  • context – dagster keyword that provides access to output information like asset name.

  • obj – a sql query or dataframe to add to the database.

Raises:

Exception – if an asset or op returns an unsupported datatype.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Load a dataframe from a sqlite database.

Parameters:

context – dagster keyword that provides access to output information like asset name.

class pudl.io_managers.PudlParquetIOManager[source]

Bases: dagster.IOManager

IOManager that writes pudl tables to pyarrow parquet files.

handle_output(context: dagster.OutputContext, df: Any) None[source]

Writes a pudl dataframe to a parquet file.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Loads a pudl table from a parquet file.

class pudl.io_managers.PudlSQLiteIOManager(base_dir: str, db_name: str, package: pudl.metadata.classes.Package | None = None, timeout: float = 1000.0)[source]

Bases: SQLiteIOManager

IO Manager that writes and retrieves dataframes from a SQLite database.

This class extends the SQLiteIOManager class to manage database metadata and dtypes using the pudl.metadata.classes.Package class.

_handle_str_output(context: dagster.OutputContext, query: str)[source]

Execute a sql query on the database.

This is used for creating output views in the database.

Parameters:
  • context – dagster keyword that provides access to output information like asset name.

  • query – sql query to execute in the database.

_handle_pandas_output(context: dagster.OutputContext, df: pandas.DataFrame)[source]

Enforce PUDL DB schema and write dataframe to SQLite.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Load a dataframe from a sqlite database.

Parameters:

context – dagster keyword that provides access to output information like asset name.

pudl.io_managers.pudl_mixed_format_io_manager(init_context) dagster.IOManager[source]

Create a format-switching IO manager dagster resource for the pudl database.

class pudl.io_managers.FercSQLiteIOManager(base_dir: str = None, db_name: str = None, md: sqlalchemy.MetaData = None, timeout: float = 1000.0)[source]

Bases: SQLiteIOManager

IO Manager for reading tables from FERC databases.

This class should be subclassed and the load_input and handle_output methods should be implemented.

This IOManager expects the database to already exist.

_setup_database(timeout: float = 1000.0) sqlalchemy.Engine[source]

Create database engine and read the metadata.

Parameters:

timeout – How many seconds the connection should wait before raising an exception, if the database is locked by another connection. If another connection opens a transaction to modify the database, it will be locked until that transaction is committed.

Returns:

SQLAlchemy engine that connects to a database in the base_dir.

Return type:

engine

abstract handle_output(context: dagster.OutputContext, obj)[source]

Handle an op or asset output.

abstract load_input(context: dagster.InputContext) pandas.DataFrame[source]

Load a dataframe from a sqlite database.

Parameters:

context – dagster keyword that provides access to output information like asset name.

class pudl.io_managers.FercDBFSQLiteIOManager(base_dir: str = None, db_name: str = None, md: sqlalchemy.MetaData = None, timeout: float = 1000.0)[source]

Bases: FercSQLiteIOManager

IO Manager for only reading tables from the FERC 1 database.

This IO Manager is for reading data only. It does not handle outputs because the raw FERC tables are not known prior to running the ETL and are not recorded in our metadata.

abstract handle_output(context: dagster.OutputContext, obj: pandas.DataFrame | str)[source]

Handle an op or asset output.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Load a dataframe from a sqlite database.

Parameters:

context – dagster keyword that provides access to output information like asset name.

pudl.io_managers.ferc1_dbf_sqlite_io_manager(init_context) FercDBFSQLiteIOManager[source]

Create a SQLiteManager dagster resource for the ferc1 dbf database.

class pudl.io_managers.FercXBRLSQLiteIOManager(base_dir: str = None, db_name: str = None, md: sqlalchemy.MetaData = None, timeout: float = 1000.0)[source]

Bases: FercSQLiteIOManager

IO Manager for only reading tables from the XBRL database.

This IO Manager is for reading data only. It does not handle outputs because the raw FERC tables are not known prior to running the ETL and are not recorded in our metadata.

static filter_for_freshest_data(table: pandas.DataFrame, primary_key: list[str]) pandas.DataFrame[source]

Get most updated values for each XBRL context.

An XBRL context includes an entity ID, the time period the data applies to, and other dimensions such as utility type. Each context has its own ID, but they are frequently redefined with the same contents but different IDs - so we identify them by their actual content.

Each row in our SQLite database includes all the facts for one context/filing pair.

If one context is represented in multiple filings, we take the most recently-reported non-null value.

This means that if a utility reports a non-null value, then later either reports a null value for it or simply omits it from the report, we keep the old non-null value, which may be erroneous. This appears to be fairly rare, affecting < 0.005% of reported values.
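The "most recently-reported non-null value" rule can be sketched in pandas. The data here is invented for illustration: one context (entity plus date) reported across three filings, where the latest filing omits the value:

```python
import pandas as pd

# Hypothetical refilings of the same context; larger filing_time = later.
facts = pd.DataFrame(
    {
        "entity_id": ["util_1", "util_1", "util_1"],
        "date": ["2020-12-31"] * 3,
        "filing_time": [1, 2, 3],
        "revenue": [100.0, 110.0, None],  # latest filing omits the value
    }
)

primary_key = ["entity_id", "date"]
deduped = (
    facts.sort_values("filing_time")
    .groupby(primary_key, as_index=False)
    # GroupBy.last() takes the last non-null entry per column, so the
    # stale-but-non-null 110.0 survives the null in the newest filing.
    .last()
)
```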

static refine_report_year(df: pandas.DataFrame, xbrl_years: list[int]) pandas.DataFrame[source]

Set a fact’s report year by its actual dates.

Sometimes a fact belongs to a context which has no ReportYear associated with it; other times there are multiple ReportYears associated with a single filing. In these cases the report year of a specific fact may be associated with the other years in the filing.

In many cases we can infer the actual report year from the fact’s associated time period - either duration or instant.

_get_primary_key(sched_table_name: str) list[str][source]
abstract handle_output(context: dagster.OutputContext, obj: pandas.DataFrame | str)[source]

Handle an op or asset output.

load_input(context: dagster.InputContext) pandas.DataFrame[source]

Load a dataframe from a sqlite database.

Parameters:

context – dagster keyword that provides access to output information like asset name.

pudl.io_managers.ferc1_xbrl_sqlite_io_manager(init_context) FercXBRLSQLiteIOManager[source]

Create a SQLiteManager dagster resource for the ferc1 xbrl database.

class pudl.io_managers.EpaCemsIOManager(base_path: upath.UPath, schema: pyarrow.Schema)[source]

Bases: dagster.UPathIOManager

An IO Manager that dumps outputs to a parquet file.

extension: str = '.parquet'[source]
abstract dump_to_path(context: dagster.OutputContext, obj: dask.dataframe.DataFrame, path: upath.UPath)[source]

Write dataframe to parquet file.

load_from_path(context: dagster.InputContext, path: upath.UPath) dask.dataframe.DataFrame[source]

Load a directory of parquet files to a dask dataframe.

pudl.io_managers.epacems_io_manager(init_context: dagster.InitResourceContext) EpaCemsIOManager[source]

IO Manager that writes EPA CEMS partitions to individual parquet files.