feast.infra.offline_stores.contrib.athena_offline_store package

Submodules

feast.infra.offline_stores.contrib.athena_offline_store.athena module

class feast.infra.offline_stores.contrib.athena_offline_store.athena.AthenaOfflineStore[source]

Bases: OfflineStore

static get_historical_features(config: RepoConfig, feature_views: List[FeatureView], feature_refs: List[str], entity_df: DataFrame | str, registry: Registry, project: str, full_feature_names: bool = False) RetrievalJob[source]

Retrieves the point-in-time correct historical feature values for the specified entity rows.

Parameters:
  • config – The config for the current feature store.

  • feature_views – A list containing all feature views that are referenced in the entity rows.

  • feature_refs – The features to be retrieved.

  • entity_df – A collection of rows containing all entity columns on which features need to be joined, as well as the timestamp column used for point-in-time joins. Either a pandas dataframe can be provided or a SQL query.

  • registry – The registry for the current feature store.

  • project – Feast project to which the feature views belong.

  • full_feature_names – If True, feature names will be prefixed with the corresponding feature view name, changing them from the format “feature” to “feature_view__feature” (e.g. “daily_transactions” changes to “customer_fv__daily_transactions”).

Returns:

A RetrievalJob that can be executed to get the features.
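To make the point-in-time semantics concrete, here is an illustrative sketch in plain Python of what the join computes. This is not the actual implementation (AthenaOfflineStore generates SQL that runs in Athena); the entity and feature rows below are invented sample data.

```python
# Illustrative sketch only: for each entity row, select the latest feature
# value whose event timestamp is at or before the entity row's timestamp.
from datetime import datetime


def point_in_time_join(entity_rows, feature_rows, join_key):
    """entity_rows: [{join_key, "event_timestamp"}];
    feature_rows: [{join_key, "event_timestamp", "value"}]."""
    out = []
    for e in entity_rows:
        # Only feature rows for the same entity, observed at or before
        # the entity row's timestamp, are eligible.
        candidates = [
            f for f in feature_rows
            if f[join_key] == e[join_key]
            and f["event_timestamp"] <= e["event_timestamp"]
        ]
        best = max(candidates, key=lambda f: f["event_timestamp"], default=None)
        out.append({**e, "value": best["value"] if best else None})
    return out


entities = [{"driver_id": 1, "event_timestamp": datetime(2024, 1, 3)}]
features = [
    {"driver_id": 1, "event_timestamp": datetime(2024, 1, 1), "value": 0.5},
    {"driver_id": 1, "event_timestamp": datetime(2024, 1, 2), "value": 0.7},
    {"driver_id": 1, "event_timestamp": datetime(2024, 1, 4), "value": 0.9},
]
print(point_in_time_join(entities, features, "driver_id"))
```

Note that the feature row from 2024-01-04 is never joined to an entity row dated 2024-01-03, even though it is the most recent value overall; this is what makes the join leakage-free.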

static pull_all_from_table_or_query(config: RepoConfig, data_source: DataSource, join_key_columns: List[str], feature_name_columns: List[str], timestamp_field: str, start_date: datetime, end_date: datetime) RetrievalJob[source]

Extracts all the entity rows (i.e. the combination of join key columns, feature columns, and timestamp columns) from the specified data source that lie within the specified time range.

All of the column names should refer to columns that exist in the data source. In particular, any mapping of column names must have already happened.

Parameters:
  • config – The config for the current feature store.

  • data_source – The data source from which the entity rows will be extracted.

  • join_key_columns – The columns of the join keys.

  • feature_name_columns – The columns of the features.

  • timestamp_field – The timestamp column.

  • start_date – The start of the time range.

  • end_date – The end of the time range.

Returns:

A RetrievalJob that can be executed to get the entity rows.

static pull_latest_from_table_or_query(config: RepoConfig, data_source: DataSource, join_key_columns: List[str], feature_name_columns: List[str], timestamp_field: str, created_timestamp_column: str | None, start_date: datetime, end_date: datetime) RetrievalJob[source]

Extracts the latest entity rows (i.e. the combination of join key columns, feature columns, and timestamp columns) from the specified data source that lie within the specified time range.

All of the column names should refer to columns that exist in the data source. In particular, any mapping of column names must have already happened.

Parameters:
  • config – The config for the current feature store.

  • data_source – The data source from which the entity rows will be extracted.

  • join_key_columns – The columns of the join keys.

  • feature_name_columns – The columns of the features.

  • timestamp_field – The timestamp column, used to determine which rows are the most recent.

  • created_timestamp_column – The column indicating when the row was created, used to break ties.

  • start_date – The start of the time range.

  • end_date – The end of the time range.

Returns:

A RetrievalJob that can be executed to get the entity rows.
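The deduplication this method performs can be sketched as follows. This is an illustrative Python rendering, not the store's implementation (in Athena this would typically be expressed in SQL, e.g. with a `ROW_NUMBER()` window over the join keys); the sample rows are invented.

```python
# Illustrative sketch only: keep, per combination of join keys, the row
# with the greatest timestamp_field, breaking ties on
# created_timestamp_column when one is provided.
from datetime import datetime


def latest_per_key(rows, join_keys, timestamp_field, created_timestamp_column=None):
    latest = {}
    for row in rows:
        key = tuple(row[k] for k in join_keys)
        # Tuple comparison gives us the tie-break for free: rows are
        # ordered first by the event timestamp, then (optionally) by
        # the created timestamp.
        sort_key = (row[timestamp_field],) + (
            (row[created_timestamp_column],) if created_timestamp_column else ()
        )
        if key not in latest or sort_key > latest[key][0]:
            latest[key] = (sort_key, row)
    return [row for _, row in latest.values()]


rows = [
    {"driver_id": 1, "ts": datetime(2024, 1, 1), "created": datetime(2024, 1, 1), "v": 1},
    {"driver_id": 1, "ts": datetime(2024, 1, 2), "created": datetime(2024, 1, 2), "v": 2},
    {"driver_id": 1, "ts": datetime(2024, 1, 2), "created": datetime(2024, 1, 3), "v": 3},
]
print(latest_per_key(rows, ["driver_id"], "ts", "created"))
```

The two rows sharing `ts == 2024-01-02` are disambiguated by `created`, so the row with `v == 3` wins.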

static write_logged_features(config: RepoConfig, data: Table | Path, source: LoggingSource, logging_config: LoggingConfig, registry: BaseRegistry)[source]

Writes logged features to a specified destination in the offline store.

If the specified destination exists, data will be appended; otherwise, the destination will be created and data will be added. Thus this function can be called repeatedly with the same destination to flush logs in chunks.

Parameters:
  • config – The config for the current feature store.

  • data – An arrow table or a path to parquet directory that contains the logs to write.

  • source – The logging source that provides a schema and some additional metadata.

  • logging_config – A LoggingConfig object that determines where the logs will be written.

  • registry – The registry for the current feature store.

class feast.infra.offline_stores.contrib.athena_offline_store.athena.AthenaOfflineStoreConfig(*, type: typing_extensions.Literal[athena] = 'athena', data_source: StrictStr, region: StrictStr, database: StrictStr, workgroup: StrictStr, s3_staging_location: StrictStr)[source]

Bases: FeastConfigBaseModel

Offline store config for AWS Athena

data_source: StrictStr

The Athena data source (e.g. AwsDataCatalog)

database: StrictStr

Athena database name

region: StrictStr

Athena’s AWS region

s3_staging_location: StrictStr

S3 path for importing & exporting data to Athena

type: typing_extensions.Literal[athena]

Offline store type selector

workgroup: StrictStr

Athena workgroup name
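Putting the fields above together, an `offline_store` block in `feature_store.yaml` might look like the following. Only the keys come from AthenaOfflineStoreConfig; the region, database, workgroup, and bucket values are placeholders you would replace with your own.

```yaml
# feature_store.yaml -- all values below are placeholders.
project: my_project
registry: data/registry.db
provider: aws
offline_store:
  type: athena
  data_source: AwsDataCatalog
  region: us-west-2
  database: my_feast_db
  workgroup: primary
  s3_staging_location: s3://my-bucket/athena-staging
```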

class feast.infra.offline_stores.contrib.athena_offline_store.athena.AthenaRetrievalJob(query: str | Callable[[], AbstractContextManager[str]], athena_client, s3_resource, config: RepoConfig, full_feature_names: bool, on_demand_feature_views: List[OnDemandFeatureView] | None = None, metadata: RetrievalMetadata | None = None)[source]

Bases: RetrievalJob

property full_feature_names: bool

Returns True if full feature names should be applied to the results of the query.

get_temp_s3_path() str[source]
get_temp_table_dml_header(temp_table_name: str, temp_external_location: str) str[source]
property metadata: RetrievalMetadata | None

Returns metadata about the retrieval job.

property on_demand_feature_views: List[OnDemandFeatureView]

Returns a list containing all the on demand feature views to be handled.

persist(storage: SavedDatasetStorage, allow_overwrite: bool | None = False, timeout: int | None = None)[source]

Synchronously executes the underlying query and persists the result in the same offline store at the specified destination.

Parameters:
  • storage – The saved dataset storage object specifying where the result should be persisted.

  • allow_overwrite – If True, a pre-existing location (e.g. table or file) can be overwritten. Currently not all individual offline store implementations make use of this parameter.

  • timeout – An optional timeout (in seconds) for the operation; not all offline store implementations make use of this parameter.

to_athena(table_name: str) None[source]

feast.infra.offline_stores.contrib.athena_offline_store.athena_source module

class feast.infra.offline_stores.contrib.athena_offline_store.athena_source.AthenaLoggingDestination(*, table_name: str)[source]

Bases: LoggingDestination

classmethod from_proto(config_proto: LoggingConfig) LoggingDestination[source]
table_name: str
to_data_source() DataSource[source]

Convert this object into a data source to read logs from an offline store.

to_proto() LoggingConfig[source]
class feast.infra.offline_stores.contrib.athena_offline_store.athena_source.AthenaOptions(table: str | None, query: str | None, database: str | None, data_source: str | None)[source]

Bases: object

Configuration options for an Athena data source.

classmethod from_proto(athena_options_proto: AthenaOptions)[source]

Creates an AthenaOptions object from a protobuf representation of Athena options.

Parameters:

athena_options_proto – A protobuf representation of a DataSource

Returns:

An AthenaOptions object based on the athena_options protobuf.

to_proto() AthenaOptions[source]

Converts an AthenaOptions object to its protobuf representation.

Returns:

An AthenaOptionsProto protobuf.

class feast.infra.offline_stores.contrib.athena_offline_store.athena_source.AthenaSource(*, timestamp_field: str | None = '', table: str | None = None, database: str | None = None, data_source: str | None = None, created_timestamp_column: str | None = None, field_mapping: Dict[str, str] | None = None, date_partition_column: str | None = None, query: str | None = None, name: str | None = None, description: str | None = '', tags: Dict[str, str] | None = None, owner: str | None = '')[source]

Bases: DataSource

created_timestamp_column: str
property data_source

Returns the Athena data_source of this Athena source.

property database

Returns the database of this Athena source.

date_partition_column: str
description: str
field_mapping: Dict[str, str]
static from_proto(data_source: DataSource)[source]

Creates an AthenaSource from a protobuf representation of an AthenaSource.

Parameters:

data_source – A protobuf representation of an AthenaSource

Returns:

An AthenaSource object based on the data_source protobuf.

get_table_column_names_and_types(config: RepoConfig) Iterable[Tuple[str, str]][source]

Returns a mapping of column names to types for this Athena source.

Parameters:

config – A RepoConfig describing the feature repo

get_table_query_string(config: RepoConfig | None = None) str[source]

Returns a string that can directly be used to reference this table in SQL.
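An illustrative sketch (not Feast's actual implementation) of what this method does: a table-backed source is rendered as a qualified table name, while a query-backed source is wrapped in parentheses so it can be used as a subquery in a FROM clause. The function name and qualification format below are assumptions for illustration.

```python
# Hypothetical helper mirroring get_table_query_string's documented
# behavior: produce a string usable directly in SQL.
def table_query_string(table=None, query=None, database=None):
    if table is not None:
        # Qualify with the database when one is known.
        return f"{database}.{table}" if database else table
    # Parenthesize a raw query so it composes as a subquery.
    return f"({query})"


print(table_query_string(table="driver_stats", database="my_feast_db"))
print(table_query_string(query="SELECT * FROM driver_stats"))
```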

name: str
owner: str
property query

Returns the Athena query of this Athena source.

static source_datatype_to_feast_value_type() Callable[[str], ValueType][source]

Returns the callable that maps a raw column type from the data source to a Feast ValueType.

property table

Returns the table of this Athena source.

tags: Dict[str, str]
timestamp_field: str
to_proto() DataSource[source]

Converts an AthenaSource object to its protobuf representation.

Returns:

A DataSourceProto object.

validate(config: RepoConfig)[source]

Validates the underlying data source.

Parameters:

config – Configuration object used to configure a feature store.

class feast.infra.offline_stores.contrib.athena_offline_store.athena_source.SavedDatasetAthenaStorage(table_ref: str, query: str | None = None, database: str | None = None, data_source: str | None = None)[source]

Bases: SavedDatasetStorage

athena_options: AthenaOptions
static from_proto(storage_proto: SavedDatasetStorage) SavedDatasetStorage[source]
to_data_source() DataSource[source]
to_proto() SavedDatasetStorage[source]

Module contents