feast package

Subpackages

Submodules

feast.cli module

class feast.cli.NoOptionDefaultFormat(name: Optional[str], context_settings: Optional[Dict[str, Any]] = None, callback: Optional[Callable[[...], Any]] = None, params: Optional[List[click.core.Parameter]] = None, help: Optional[str] = None, epilog: Optional[str] = None, short_help: Optional[str] = None, options_metavar: Optional[str] = '[OPTIONS]', add_help_option: bool = True, no_args_is_help: bool = False, hidden: bool = False, deprecated: bool = False)[source]

Bases: click.core.Command

format_options(ctx: click.core.Context, formatter: click.formatting.HelpFormatter)[source]

Writes all the options into the formatter if they exist.

feast.client module

feast.config module

feast.constants module

feast.data_format module

class feast.data_format.AvroFormat(schema_json: str)[source]

Bases: feast.data_format.StreamFormat

Defines the Avro streaming data format that encodes data in Avro format

to_proto()[source]

Convert this StreamFormat into its protobuf representation.

class feast.data_format.FileFormat[source]

Bases: abc.ABC

Defines an abstract file format used to encode feature data in files

classmethod from_proto(proto)[source]

Construct this FileFormat from its protobuf representation. Raises NotImplementedError if FileFormat specified in given proto is not supported.

abstract to_proto()[source]

Convert this FileFormat into its protobuf representation.

class feast.data_format.ParquetFormat[source]

Bases: feast.data_format.FileFormat

Defines the Parquet data format

to_proto()[source]

Convert this FileFormat into its protobuf representation.

class feast.data_format.ProtoFormat(class_path: str)[source]

Bases: feast.data_format.StreamFormat

Defines the Protobuf data format

to_proto()[source]

Convert this StreamFormat into its protobuf representation.

class feast.data_format.StreamFormat[source]

Bases: abc.ABC

Defines an abstract streaming data format used to encode feature data in streams

classmethod from_proto(proto)[source]

Construct this StreamFormat from its protobuf representation.

abstract to_proto()[source]

Convert this StreamFormat into its protobuf representation.
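
Examples

A minimal sketch of constructing the file and stream formats defined in this module and converting them to their protobuf representations; the Avro schema shown is purely illustrative.

>>> from feast.data_format import AvroFormat, ParquetFormat
>>> parquet_format = ParquetFormat()
>>> avro_format = AvroFormat(schema_json='{"type": "record", "name": "driver_stats", "fields": [{"name": "driver_id", "type": "long"}]}')
>>> parquet_proto = parquet_format.to_proto()
>>> avro_proto = avro_format.to_proto()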

feast.data_source module

class feast.data_source.DataSource(name: str, event_timestamp_column: Optional[str] = None, created_timestamp_column: Optional[str] = None, field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = None)[source]

Bases: abc.ABC

DataSource that can be used to source features.

Parameters
  • name – Name of data source, which should be unique within a project

  • event_timestamp_column (optional) – Event timestamp column used for point in time joins of feature values.

  • created_timestamp_column (optional) – Timestamp column indicating when the row was created, used for deduplicating rows.

  • field_mapping (optional) – A dictionary mapping of column names in this data source to feature names in a feature table or view. Only used for feature columns, not entity or timestamp columns.

  • date_partition_column (optional) – Timestamp column used for partitioning.

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
abstract static from_proto(data_source: feast.core.DataSource_pb2.DataSource) Any[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
abstract static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

abstract to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts this DataSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.

class feast.data_source.KafkaOptions(bootstrap_servers: str, message_format: feast.data_format.StreamFormat, topic: str)[source]

Bases: object

DataSource Kafka options used to source features from Kafka messages

classmethod from_proto(kafka_options_proto: feast.core.DataSource_pb2.KafkaOptions)[source]

Creates a KafkaOptions object from a protobuf representation of Kafka options

Parameters

kafka_options_proto – A protobuf representation of KafkaOptions

Returns

Returns a KafkaOptions object based on the kafka_options protobuf

to_proto() feast.core.DataSource_pb2.KafkaOptions[source]

Converts this KafkaOptions object to its protobuf representation.

Returns

KafkaOptionsProto protobuf

class feast.data_source.KafkaSource(name: str, event_timestamp_column: str, bootstrap_servers: str, message_format: feast.data_format.StreamFormat, topic: str, created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '')[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts this DataSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
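
Examples

A minimal sketch of defining a Kafka stream source; the bootstrap server, topic, column names, and Avro schema below are hypothetical placeholders.

>>> from feast.data_source import KafkaSource
>>> from feast.data_format import AvroFormat
>>> driver_stats_stream = KafkaSource(
...     name="driver_stats_stream",
...     event_timestamp_column="event_timestamp",
...     bootstrap_servers="localhost:9092",
...     message_format=AvroFormat(schema_json='{"type": "record", "name": "driver_stats", "fields": [{"name": "driver_id", "type": "long"}]}'),
...     topic="driver_stats",
...     created_timestamp_column="created",
... )
>>> kafka_source_proto = driver_stats_stream.to_proto()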

class feast.data_source.KinesisOptions(record_format: feast.data_format.StreamFormat, region: str, stream_name: str)[source]

Bases: object

DataSource Kinesis options used to source features from Kinesis records

classmethod from_proto(kinesis_options_proto: feast.core.DataSource_pb2.KinesisOptions)[source]

Creates a KinesisOptions object from a protobuf representation of Kinesis options

Parameters

kinesis_options_proto – A protobuf representation of KinesisOptions

Returns

Returns a KinesisOptions object based on the kinesis_options protobuf

to_proto() feast.core.DataSource_pb2.KinesisOptions[source]

Converts this KinesisOptions object to its protobuf representation.

Returns

KinesisOptionsProto protobuf

class feast.data_source.KinesisSource(name: str, event_timestamp_column: str, created_timestamp_column: str, record_format: feast.data_format.StreamFormat, region: str, stream_name: str, field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '')[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts this DataSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.

class feast.data_source.RequestDataSource(name: str, schema: Dict[str, feast.value_type.ValueType])[source]

Bases: feast.data_source.DataSource

RequestDataSource that can be used to provide input features for on demand transforms

Parameters
  • name – Name of the request data source

  • schema – Schema mapping from the input feature name to a ValueType

static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
schema: Dict[str, feast.value_type.ValueType]
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts this DataSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
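
Examples

A minimal sketch of defining a request data source for an on demand transform; the source and field names are illustrative.

>>> from feast import ValueType
>>> from feast.data_source import RequestDataSource
>>> vals_to_add = RequestDataSource(
...     name="vals_to_add",
...     schema={"val_to_add": ValueType.INT64},
... )
>>> request_source_proto = vals_to_add.to_proto()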

class feast.data_source.SourceType(value)[source]

Bases: enum.Enum

DataSource value type. Used to define source types in DataSource.

BATCH_BIGQUERY = 2
BATCH_FILE = 1
STREAM_KAFKA = 3
STREAM_KINESIS = 4
UNKNOWN = 0

feast.driver_test_data module

class feast.driver_test_data.EventTimestampType(value)[source]

Bases: enum.Enum

An enumeration.

TZ_AWARE_FIXED_OFFSET = 2
TZ_AWARE_US_PACIFIC = 3
TZ_AWARE_UTC = 1
TZ_NAIVE = 0
feast.driver_test_data.create_customer_daily_profile_df(customers, start_date, end_date) pandas.core.frame.DataFrame[source]

Example df generated by this function:

| event_timestamp  | customer_id | current_balance | avg_passenger_count | lifetime_trip_count | created          |
|------------------+-------------+-----------------+---------------------+---------------------+------------------|
| 2021-03-17 19:31 | 1010        | 0.889188        | 0.049057            | 412                 | 2021-03-24 19:38 |
| 2021-03-18 19:31 | 1010        | 0.979273        | 0.212630            | 639                 | 2021-03-24 19:38 |
| 2021-03-19 19:31 | 1010        | 0.976549        | 0.176881            | 70                  | 2021-03-24 19:38 |
| 2021-03-20 19:31 | 1010        | 0.273697        | 0.325012            | 68                  | 2021-03-24 19:38 |
| 2021-03-21 19:31 | 1010        | 0.438262        | 0.313009            | 192                 | 2021-03-24 19:38 |
| ...              | ...         | ...             | ...                 | ...                 | ...              |
| 2021-03-19 19:31 | 1001        | 0.738860        | 0.857422            | 344                 | 2021-03-24 19:38 |
| 2021-03-20 19:31 | 1001        | 0.848397        | 0.745989            | 106                 | 2021-03-24 19:38 |
| 2021-03-21 19:31 | 1001        | 0.301552        | 0.185873            | 812                 | 2021-03-24 19:38 |
| 2021-03-22 19:31 | 1001        | 0.943030        | 0.561219            | 322                 | 2021-03-24 19:38 |
| 2021-03-23 19:31 | 1001        | 0.354919        | 0.810093            | 273                 | 2021-03-24 19:38 |

feast.driver_test_data.create_driver_hourly_stats_df(drivers, start_date, end_date) pandas.core.frame.DataFrame[source]

Example df generated by this function:

| event_timestamp  | driver_id | conv_rate | acc_rate | avg_daily_trips | created          |
|------------------+-----------+-----------+----------+-----------------+------------------|
| 2021-03-17 19:31 | 5010      | 0.229297  | 0.685843 | 861             | 2021-03-24 19:34 |
| 2021-03-17 20:31 | 5010      | 0.781655  | 0.861280 | 769             | 2021-03-24 19:34 |
| 2021-03-17 21:31 | 5010      | 0.150333  | 0.525581 | 778             | 2021-03-24 19:34 |
| 2021-03-17 22:31 | 5010      | 0.951701  | 0.228883 | 570             | 2021-03-24 19:34 |
| 2021-03-17 23:31 | 5010      | 0.819598  | 0.262503 | 473             | 2021-03-24 19:34 |
| ...              | ...       | ...       | ...      | ...             | ...              |
| 2021-03-24 16:31 | 5001      | 0.061585  | 0.658140 | 477             | 2021-03-24 19:34 |
| 2021-03-24 17:31 | 5001      | 0.088949  | 0.303897 | 618             | 2021-03-24 19:34 |
| 2021-03-24 18:31 | 5001      | 0.096652  | 0.747421 | 480             | 2021-03-24 19:34 |
| 2021-03-17 19:31 | 5005      | 0.142936  | 0.707596 | 466             | 2021-03-24 19:34 |
| 2021-03-17 19:31 | 5005      | 0.142936  | 0.707596 | 466             | 2021-03-24 19:34 |

feast.driver_test_data.create_field_mapping_df(start_date, end_date) pandas.core.frame.DataFrame[source]

Example df generated by this function:

| event_timestamp  | column_name | created          |
|------------------+-------------+------------------|
| 2021-03-17 19:00 | 99          | 2021-03-24 19:38 |
| 2021-03-17 19:00 | 22          | 2021-03-24 19:38 |
| 2021-03-17 19:00 | 7           | 2021-03-24 19:38 |
| 2021-03-17 19:00 | 45          | 2021-03-24 19:38 |

feast.driver_test_data.create_global_daily_stats_df(start_date, end_date) pandas.core.frame.DataFrame[source]

Example df generated by this function:

| event_timestamp  | num_rides | avg_ride_length | created          |
|------------------+-----------+-----------------+------------------|
| 2021-03-17 19:00 | 99        | 0.889188        | 2021-03-24 19:38 |
| 2021-03-18 19:00 | 52        | 0.979273        | 2021-03-24 19:38 |
| 2021-03-19 19:00 | 66        | 0.976549        | 2021-03-24 19:38 |
| 2021-03-20 19:00 | 84        | 0.273697        | 2021-03-24 19:38 |
| 2021-03-21 19:00 | 89        | 0.438262        | 2021-03-24 19:38 |
| ...              | ...       | ...             | ...              |
| 2021-03-24 19:00 | 54        | 0.738860        | 2021-03-24 19:38 |
| 2021-03-25 19:00 | 58        | 0.848397        | 2021-03-24 19:38 |
| 2021-03-26 19:00 | 69        | 0.301552        | 2021-03-24 19:38 |
| 2021-03-27 19:00 | 63        | 0.943030        | 2021-03-24 19:38 |
| 2021-03-28 19:00 | 79        | 0.354919        | 2021-03-24 19:38 |

feast.driver_test_data.create_location_stats_df(locations, start_date, end_date) pandas.core.frame.DataFrame[source]

Example df generated by this function:

| event_timestamp | location_id | temperature | created |

feast.driver_test_data.create_orders_df(customers, drivers, start_date, end_date, order_count, locations=None) pandas.core.frame.DataFrame[source]

Example df generated by this function (if locations):

| order_id | driver_id | customer_id | origin_id | destination_id | order_is_success | event_timestamp |

feast.entity module

class feast.entity.Entity(name: str, value_type: feast.value_type.ValueType = ValueType.UNKNOWN, description: str = '', join_key: Optional[str] = None, tags: Dict[str, str] = None, labels: Optional[Dict[str, str]] = None, owner: str = '')[source]

Bases: object

An entity defines a collection of entity instances (e.g., all drivers) for which features can be defined. An entity can also contain associated metadata.

name

The unique name of the entity.

Type

str

value_type

The type of the entity, such as string or float.

Type

feast.value_type.ValueType

join_key

A property that uniquely identifies different entities within the collection. The join_key property is typically used for joining entities with their associated features. If not specified, defaults to the name.

Type

str

description

A human-readable description.

Type

str

tags

A dictionary of key-value pairs to store arbitrary metadata.

Type

Dict[str, str]

owner

The owner of the entity, typically the email of the primary maintainer.

Type

str

created_timestamp

The time when the entity was created.

Type

Optional[datetime.datetime]

last_updated_timestamp

The time when the entity was last updated.

Type

Optional[datetime.datetime]

created_timestamp: Optional[datetime.datetime]
description: str
classmethod from_proto(entity_proto: feast.core.Entity_pb2.Entity)[source]

Creates an entity from a protobuf representation of an entity.

Parameters

entity_proto – A protobuf representation of an entity.

Returns

An Entity object based on the entity protobuf.

is_valid()[source]

Validates the state of this entity locally.

Raises

ValueError – The entity does not have a name or does not have a type.

join_key: str
last_updated_timestamp: Optional[datetime.datetime]
name: str
owner: str
tags: Dict[str, str]
to_proto() feast.core.Entity_pb2.Entity[source]

Converts an entity object to its protobuf representation.

Returns

An EntityProto protobuf.

value_type: feast.value_type.ValueType
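
Examples

A minimal sketch of defining a driver entity and converting it to its protobuf representation; the names and description are illustrative.

>>> from feast import Entity, ValueType
>>> driver = Entity(
...     name="driver",
...     value_type=ValueType.INT64,
...     join_key="driver_id",
...     description="Driver entity for ride-hailing features",
... )
>>> driver.is_valid()
>>> entity_proto = driver.to_proto()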

feast.errors module

exception feast.errors.AwsAPIGatewayDoesNotExist(resource_name: str)[source]

Bases: Exception

exception feast.errors.AwsLambdaDoesNotExist(resource_name: str)[source]

Bases: Exception

exception feast.errors.BigQueryJobCancelled(job_id)[source]

Bases: Exception

exception feast.errors.BigQueryJobStillRunning(job_id)[source]

Bases: Exception

exception feast.errors.ConflictingFeatureViewNames(feature_view_name: str)[source]

Bases: Exception

exception feast.errors.DataSourceNoNameException[source]

Bases: Exception

exception feast.errors.DataSourceNotFoundException(path)[source]

Bases: Exception

exception feast.errors.DataSourceObjectNotFoundException(name, project=None)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.DockerDaemonNotRunning[source]

Bases: Exception

exception feast.errors.EntityNotFoundException(name, project=None)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.EntityTimestampInferenceException(expected_column_name: str)[source]

Bases: Exception

exception feast.errors.ExperimentalFeatureNotEnabled(feature_flag_name: str)[source]

Bases: Exception

exception feast.errors.FeastClassImportError(module_name: str, class_name: str)[source]

Bases: Exception

exception feast.errors.FeastEntityDFMissingColumnsError(expected, missing)[source]

Bases: Exception

exception feast.errors.FeastExtrasDependencyImportError(extras_type: str, nested_error: str)[source]

Bases: Exception

exception feast.errors.FeastFeatureServerTypeInvalidError(feature_server_type: str)[source]

Bases: Exception

exception feast.errors.FeastFeatureServerTypeSetError(feature_server_type: str)[source]

Bases: Exception

exception feast.errors.FeastInvalidBaseClass(class_name: str, class_type: str)[source]

Bases: Exception

exception feast.errors.FeastInvalidInfraObjectType[source]

Bases: Exception

exception feast.errors.FeastJoinKeysDuringMaterialization(source: str, join_key_columns: Set[str], source_columns: Set[str])[source]

Bases: Exception

exception feast.errors.FeastModuleImportError(module_name: str, class_name: str)[source]

Bases: Exception

exception feast.errors.FeastObjectNotFoundException[source]

Bases: Exception

exception feast.errors.FeastOfflineStoreUnsupportedDataSource(offline_store_name: str, data_source_name: str)[source]

Bases: Exception

exception feast.errors.FeastOnlineStoreInvalidName(online_store_class_name: str)[source]

Bases: Exception

exception feast.errors.FeastOnlineStoreUnsupportedDataSource(online_store_name: str, data_source_name: str)[source]

Bases: Exception

exception feast.errors.FeastProviderLoginError[source]

Bases: Exception

Error class that indicates a user has not authenticated with their provider.

exception feast.errors.FeastProviderNotImplementedError(provider_name)[source]

Bases: Exception

exception feast.errors.FeastProviderNotSetError[source]

Bases: Exception

exception feast.errors.FeatureNameCollisionError(feature_refs_collisions: List[str], full_feature_names: bool)[source]

Bases: Exception

exception feast.errors.FeatureServiceNotFoundException(name, project=None)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.FeatureViewNotFoundException(name, project=None)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.IncompatibleRegistryStoreClass(actual_class: str, expected_class: str)[source]

Bases: Exception

exception feast.errors.InvalidEntityType(entity_type: type)[source]

Bases: Exception

exception feast.errors.OnDemandFeatureViewNotFoundException(name, project=None)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.RedshiftCredentialsError[source]

Bases: Exception

exception feast.errors.RedshiftQueryError(details)[source]

Bases: Exception

exception feast.errors.RedshiftTableNameTooLong(table_name: str)[source]

Bases: Exception

exception feast.errors.RegistryInferenceFailure(repo_obj_type: str, specific_issue: str)[source]

Bases: Exception

exception feast.errors.RepoConfigPathDoesNotExist[source]

Bases: Exception

exception feast.errors.RequestDataNotFoundInEntityDfException(feature_name, feature_view_name)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.RequestDataNotFoundInEntityRowsException(feature_names)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.S3RegistryBucketForbiddenAccess(bucket)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.S3RegistryBucketNotExist(bucket)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.SavedDatasetNotFound(name: str, project: str)[source]

Bases: feast.errors.FeastObjectNotFoundException

exception feast.errors.SnowflakeCredentialsError[source]

Bases: Exception

exception feast.errors.SnowflakeIncompleteConfig(e: KeyError)[source]

Bases: Exception

exception feast.errors.SnowflakeQueryError(details)[source]

Bases: Exception

exception feast.errors.SnowflakeQueryUnknownError(query: str)[source]

Bases: Exception

exception feast.errors.SpecifiedFeaturesNotPresentError(specified_features: List[str], feature_view_name: str)[source]

Bases: Exception
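
Examples

A minimal sketch of handling a Feast lookup error, assuming a feature repository exists at feature_repo; the feature view name is a hypothetical placeholder.

>>> from feast import FeatureStore
>>> from feast.errors import FeatureViewNotFoundException
>>> fs = FeatureStore(repo_path="feature_repo")
>>> try:
...     fv = fs.get_feature_view("nonexistent_feature_view")
... except FeatureViewNotFoundException:
...     fv = None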

feast.feature module

class feast.feature.Feature(name: str, dtype: feast.value_type.ValueType, labels: Optional[Dict[str, str]] = None)[source]

Bases: object

A Feature represents a class of serveable features.

Parameters
  • name – Name of the feature.

  • dtype – The type of the feature, such as string or float.

  • labels (optional) – User-defined metadata in dictionary form.

property dtype: feast.value_type.ValueType

Gets the data type of this feature.

classmethod from_proto(feature_proto: feast.core.Feature_pb2.FeatureSpecV2)[source]
Parameters

feature_proto – FeatureSpecV2 protobuf object

Returns

Feature object

property labels: Dict[str, str]

Gets the labels of this feature.

property name

Gets the name of this feature.

to_proto() feast.core.Feature_pb2.FeatureSpecV2[source]

Converts Feature object to its Protocol Buffer representation.

Returns

A FeatureSpecProto protobuf.
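
Examples

A minimal sketch of defining a feature and round-tripping it through its protobuf representation; the feature name is illustrative.

>>> from feast import Feature, ValueType
>>> conv_rate = Feature(name="conv_rate", dtype=ValueType.FLOAT)
>>> feature_proto = conv_rate.to_proto()
>>> round_trip = Feature.from_proto(feature_proto)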

feast.feature_store module

class feast.feature_store.FeatureStore(repo_path: Optional[str] = None, config: Optional[feast.repo_config.RepoConfig] = None)[source]

Bases: object

A FeatureStore object is used to define, create, and retrieve features.

Parameters
  • repo_path (optional) – Path to a feature_store.yaml used to configure the feature store.

  • config (optional) – Configuration object used to configure the feature store.

apply(objects: Union[feast.data_source.DataSource, feast.entity.Entity, feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.feature_service.FeatureService, List[Union[feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.entity.Entity, feast.feature_service.FeatureService, feast.data_source.DataSource]]], objects_to_delete: Optional[List[Union[feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.entity.Entity, feast.feature_service.FeatureService, feast.data_source.DataSource]]] = None, partial: bool = True)[source]

Register objects to metadata store and update related infrastructure.

The apply method takes one or more definitions (e.g., Entity, FeatureView) and registers or updates these objects in the Feast registry. Once the apply method has updated the infrastructure (e.g., create tables in an online store), it will commit the updated registry. All operations are idempotent, meaning they can safely be rerun.

Parameters
  • objects – A single object, or a list of objects that should be registered with the Feature Store.

  • objects_to_delete – A list of objects to be deleted from the registry and removed from the provider’s infrastructure. This deletion will only be performed if partial is set to False.

  • partial – If True, apply will only handle the specified objects; if False, apply will also delete all the objects in objects_to_delete, and tear down any associated cloud resources.

Raises

ValueError – The ‘objects’ parameter could not be parsed properly.

Examples

Register an Entity and a FeatureView.

>>> from feast import FeatureStore, Entity, FeatureView, Feature, ValueType, FileSource, RepoConfig
>>> from datetime import timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> driver = Entity(name="driver_id", value_type=ValueType.INT64, description="driver id")
>>> driver_hourly_stats = FileSource(
...     path="feature_repo/data/driver_stats.parquet",
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )
>>> driver_hourly_stats_view = FeatureView(
...     name="driver_hourly_stats",
...     entities=["driver_id"],
...     ttl=timedelta(seconds=86400 * 1),
...     batch_source=driver_hourly_stats,
... )
>>> fs.apply([driver_hourly_stats_view, driver]) # register entity and feature view
config: feast.repo_config.RepoConfig
create_saved_dataset(from_: feast.infra.offline_stores.offline_store.RetrievalJob, name: str, storage: feast.saved_dataset.SavedDatasetStorage, tags: Optional[Dict[str, str]] = None, feature_service: Optional[feast.feature_service.FeatureService] = None, profiler: Optional[feast.dqm.profilers.ge_profiler.GEProfiler] = None) feast.saved_dataset.SavedDataset[source]

Executes the provided retrieval job and persists its result in the given storage. The storage type (e.g., BigQuery or Redshift) must be the same as the globally configured offline store. After the data is successfully persisted, a saved dataset object with the dataset metadata is committed to the registry. The name of the saved dataset should be unique within a project, since a previously stored dataset with the same name can be overwritten.

Returns

SavedDataset object with attached RetrievalJob

Raises

ValueError – The given retrieval job does not have metadata.
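
Examples

A minimal sketch of persisting a retrieval job as a saved dataset, assuming the file offline store; the SavedDatasetFileStorage import path, the dataset name, the storage path, and entity_df (an entity dataframe as in the get_historical_features example later in this section) are assumptions, not a definitive recipe.

>>> from feast import FeatureStore
>>> from feast.infra.offline_stores.file_source import SavedDatasetFileStorage
>>> fs = FeatureStore(repo_path="feature_repo")
>>> job = fs.get_historical_features(
...     entity_df=entity_df,
...     features=["driver_hourly_stats:conv_rate"],
... )
>>> dataset = fs.create_saved_dataset(
...     from_=job,
...     name="driver_training_dataset",
...     storage=SavedDatasetFileStorage(path="data/driver_training_dataset.parquet"),
... )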

delete_feature_service(name: str)[source]

Deletes a feature service.

Parameters

name – Name of feature service.

Raises

FeatureServiceNotFoundException – The feature service could not be found.

delete_feature_view(name: str)[source]

Deletes a feature view.

Parameters

name – Name of feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

static ensure_request_data_values_exist(needed_request_data: Set[str], needed_request_fv_features: Set[str], request_data_features: Dict[str, List[Any]])[source]
get_data_source(name: str) feast.data_source.DataSource[source]

Retrieves a data source from the registry.

Parameters

name – Name of the data source.

Returns

The specified data source.

Raises

DataSourceObjectNotFoundException – The data source could not be found.

get_entity(name: str) feast.entity.Entity[source]

Retrieves an entity.

Parameters

name – Name of entity.

Returns

The specified entity.

Raises

EntityNotFoundException – The entity could not be found.

get_feature_server_endpoint() Optional[str][source]

Returns endpoint for the feature server, if it exists.

get_feature_service(name: str, allow_cache: bool = False) feast.feature_service.FeatureService[source]

Retrieves a feature service.

Parameters
  • name – Name of feature service.

  • allow_cache – Whether to allow returning feature services from a cached registry.

Returns

The specified feature service.

Raises

FeatureServiceNotFoundException – The feature service could not be found.

get_feature_view(name: str) feast.feature_view.FeatureView[source]

Retrieves a feature view.

Parameters

name – Name of feature view.

Returns

The specified feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

get_historical_features(entity_df: Union[pandas.core.frame.DataFrame, str], features: Union[List[str], feast.feature_service.FeatureService], full_feature_names: bool = False) feast.infra.offline_stores.offline_store.RetrievalJob[source]

Enrich an entity dataframe with historical feature values for either training or batch scoring.

This method joins historical feature data from one or more feature views to an entity dataframe by using a time travel join.

Each feature view is joined to the entity dataframe using all entities configured for the respective feature view. All configured entities must be available in the entity dataframe. Therefore, the entity dataframe must contain all entities found in all feature views, but the individual feature views can have different entities.

Time travel is based on the configured TTL for each feature view. A shorter TTL will limit the amount of scanning that will be done in order to find feature data for a specific entity key. Setting a short TTL may result in null values being returned.

Parameters
  • entity_df (Union[pd.DataFrame, str]) – An entity dataframe is a collection of rows containing all entity columns (e.g., customer_id, driver_id) on which features need to be joined, as well as a event_timestamp column used to ensure point-in-time correctness. Either a Pandas DataFrame can be provided or a string SQL query. The query must be of a format supported by the configured offline store (e.g., BigQuery)

  • features – A list of features, that should be retrieved from the offline store. Either a list of string feature references can be provided or a FeatureService object. Feature references are of the format “feature_view:feature”, e.g., “customer_fv:daily_transactions”.

  • full_feature_names – A boolean that provides the option to add the feature view prefixes to the feature names, changing them from the format “feature” to “feature_view__feature” (e.g., “daily_transactions” changes to “customer_fv__daily_transactions”). By default, this value is set to False.

Returns

RetrievalJob which can be used to materialize the results.

Raises

ValueError – Both or neither of features and feature_refs are specified.

Examples

Retrieve historical features from a local offline store.

>>> from feast import FeatureStore, RepoConfig
>>> import pandas as pd
>>> fs = FeatureStore(repo_path="feature_repo")
>>> entity_df = pd.DataFrame.from_dict(
...     {
...         "driver_id": [1001, 1002],
...         "event_timestamp": [
...             datetime(2021, 4, 12, 10, 59, 42),
...             datetime(2021, 4, 12, 8, 12, 10),
...         ],
...     }
... )
>>> retrieval_job = fs.get_historical_features(
...     entity_df=entity_df,
...     features=[
...         "driver_hourly_stats:conv_rate",
...         "driver_hourly_stats:acc_rate",
...         "driver_hourly_stats:avg_daily_trips",
...     ],
... )
>>> feature_data = retrieval_job.to_df()
static get_needed_request_data(grouped_odfv_refs: List[Tuple[feast.on_demand_feature_view.OnDemandFeatureView, List[str]]], grouped_request_fv_refs: List[Tuple[feast.request_feature_view.RequestFeatureView, List[str]]]) Tuple[Set[str], Set[str]][source]
get_on_demand_feature_view(name: str) feast.on_demand_feature_view.OnDemandFeatureView[source]

Retrieves an on demand feature view.

Parameters

name – Name of feature view.

Returns

The specified feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

get_online_features(features: Union[List[str], feast.feature_service.FeatureService], entity_rows: List[Dict[str, Any]], full_feature_names: bool = False) feast.online_response.OnlineResponse[source]

Retrieves the latest online feature data.

Note: This method will download the full feature registry the first time it is run. If you are using a remote registry like GCS or S3 then that may take a few seconds. The registry remains cached up to a TTL duration (which can be set to infinity). If the cached registry is stale (more time than the TTL has passed), then a new registry will be downloaded synchronously by this method. This download may introduce latency to online feature retrieval. In order to avoid synchronous downloads, please call refresh_registry() prior to the TTL being reached. Remember it is possible to set the cache TTL to infinity (cache forever).

Parameters
  • features – List of feature references that will be returned for each entity. Each feature reference should have the following format: “feature_view:feature”, where “feature_view” and “feature” refer to the FeatureView and Feature names respectively. Only the feature name is required.

  • entity_rows – A list of dictionaries where each key-value is an entity-name, entity-value pair.

Returns

OnlineResponse containing the feature data in records.

Raises

Exception – No entity with the specified name exists.

Examples

Retrieve the latest online feature values for a set of drivers (these features must already have been materialized into the online store, for example over the interval from 3 hours ago to 10 minutes ago).

>>> from feast import FeatureStore, RepoConfig
>>> fs = FeatureStore(repo_path="feature_repo")
>>> online_response = fs.get_online_features(
...     features=[
...         "driver_hourly_stats:conv_rate",
...         "driver_hourly_stats:acc_rate",
...         "driver_hourly_stats:avg_daily_trips",
...     ],
...     entity_rows=[{"driver_id": 1001}, {"driver_id": 1002}, {"driver_id": 1003}, {"driver_id": 1004}],
... )
>>> online_response_dict = online_response.to_dict()
get_saved_dataset(name: str) feast.saved_dataset.SavedDataset[source]

Finds a saved dataset in the registry by the provided name and creates a retrieval job to pull the whole dataset from storage (the offline store).

If no dataset with the provided name can be found, a SavedDatasetNotFound exception is raised.

Data will be retrieved from the globally configured offline store.

Returns

SavedDataset with RetrievalJob attached

Raises

SavedDatasetNotFound

list_data_sources(allow_cache: bool = False) List[feast.data_source.DataSource][source]

Retrieves the list of data sources from the registry.

Parameters

allow_cache – Whether to allow returning data sources from a cached registry.

Returns

A list of data sources.

list_entities(allow_cache: bool = False) List[feast.entity.Entity][source]

Retrieves the list of entities from the registry.

Parameters

allow_cache – Whether to allow returning entities from a cached registry.

Returns

A list of entities.

list_feature_services() List[feast.feature_service.FeatureService][source]

Retrieves the list of feature services from the registry.

Returns

A list of feature services.

list_feature_views(allow_cache: bool = False) List[feast.feature_view.FeatureView][source]

Retrieves the list of feature views from the registry.

Parameters

allow_cache – Whether to allow returning feature views from a cached registry.

Returns

A list of feature views.

list_on_demand_feature_views(allow_cache: bool = False) List[feast.on_demand_feature_view.OnDemandFeatureView][source]

Retrieves the list of on demand feature views from the registry.

Returns

A list of on demand feature views.

list_request_feature_views(allow_cache: bool = False) List[feast.request_feature_view.RequestFeatureView][source]

Retrieves the list of request feature views from the registry.

Parameters

allow_cache – Whether to allow returning request feature views from a cached registry.

Returns

A list of request feature views.

materialize(start_date: datetime.datetime, end_date: datetime.datetime, feature_views: Optional[List[str]] = None) None[source]

Materialize data from the offline store into the online store.

This method loads feature data in the specified interval from either the specified feature views, or all feature views if none are specified, into the online store where it is available for online serving.

Parameters
  • start_date (datetime) – Start date for time range of data to materialize into the online store

  • end_date (datetime) – End date for time range of data to materialize into the online store

  • feature_views (List[str]) – Optional list of feature view names. If selected, will only run materialization for the specified feature views.

Examples

Materialize all features into the online store over the interval from 3 hours ago to 10 minutes ago.

>>> from feast import FeatureStore, RepoConfig
>>> from datetime import datetime, timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> fs.materialize(
...     start_date=datetime.utcnow() - timedelta(hours=3), end_date=datetime.utcnow() - timedelta(minutes=10)
... )
Materializing...

...
materialize_incremental(end_date: datetime.datetime, feature_views: Optional[List[str]] = None) None[source]

Materialize incremental new data from the offline store into the online store.

This method loads incremental new feature data up to the specified end time from either the specified feature views, or all feature views if none are specified, into the online store where it is available for online serving. The start time of the interval materialized is either the most recent end time of a prior materialization or (now - ttl) if no such prior materialization exists.

Parameters
  • end_date (datetime) – End date for time range of data to materialize into the online store

  • feature_views (List[str]) – Optional list of feature view names. If selected, will only run materialization for the specified feature views.

Raises

Exception – A feature view being materialized does not have a TTL set.

Examples

Materialize all features into the online store up to 5 minutes ago.

>>> from feast import FeatureStore, RepoConfig
>>> from datetime import datetime, timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> fs.materialize_incremental(end_date=datetime.utcnow() - timedelta(minutes=5))
Materializing...

...
property project: str

Gets the project of this feature store.

refresh_registry()[source]

Fetches and caches a copy of the feature registry in memory.

Explicitly calling this method allows for direct control of the state of the registry cache. Every time this method is called the complete registry state will be retrieved from the remote registry store backend (e.g., GCS, S3), and the cache timer will be reset. If refresh_registry() is run before get_online_features() is called, then get_online_features() will use the cached registry instead of retrieving (and caching) the registry itself.

Additionally, the TTL for the registry cache can be set to infinity (by setting it to 0), which means that refresh_registry() will become the only way to update the cached registry. If the TTL is set to a value greater than 0, then once the cache becomes stale (more time than the TTL has passed), a new cache will be downloaded synchronously, which may increase latencies if the triggering method is get_online_features()
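
Examples

A minimal sketch of refreshing the registry cache before serving online features, assuming a feature repository exists at feature_repo; the feature reference and entity row are illustrative.

>>> from feast import FeatureStore
>>> fs = FeatureStore(repo_path="feature_repo")
>>> fs.refresh_registry()
>>> online_response = fs.get_online_features(
...     features=["driver_hourly_stats:conv_rate"],
...     entity_rows=[{"driver_id": 1001}],
... )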

property registry: feast.registry.Registry

Gets the registry of this feature store.

repo_path: pathlib.Path
serve(host: str, port: int, no_access_log: bool) None[source]

Start the feature consumption server locally on a given port.

serve_transformations(port: int) None[source]

Start the feature transformation server locally on a given port.

teardown()[source]

Tears down all local and cloud resources for the feature store.

version() str[source]

Returns the version of the current Feast SDK/CLI.

write_to_online_store(feature_view_name: str, df: pandas.core.frame.DataFrame, allow_registry_cache: bool = True)[source]

Ingests data directly into the online store.
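
Examples

A minimal sketch of writing a dataframe directly into the online store; the feature view name and column names are assumed to match the driver_hourly_stats example used elsewhere in this section.

>>> from feast import FeatureStore
>>> from datetime import datetime
>>> import pandas as pd
>>> fs = FeatureStore(repo_path="feature_repo")
>>> driver_stats_df = pd.DataFrame(
...     {
...         "driver_id": [1001],
...         "event_timestamp": [datetime.utcnow()],
...         "created": [datetime.utcnow()],
...         "conv_rate": [0.85],
...         "acc_rate": [0.91],
...         "avg_daily_trips": [14],
...     }
... )
>>> fs.write_to_online_store("driver_hourly_stats", driver_stats_df)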

feast.feature_table module

feast.feature_view module

class feast.feature_view.FeatureView(name: str, entities: List[str], ttl: Union[google.protobuf.duration_pb2.Duration, datetime.timedelta], input: Optional[feast.data_source.DataSource] = None, batch_source: Optional[feast.data_source.DataSource] = None, stream_source: Optional[feast.data_source.DataSource] = None, features: Optional[List[feast.feature.Feature]] = None, tags: Optional[Dict[str, str]] = None, online: bool = True)[source]

Bases: feast.base_feature_view.BaseFeatureView

A FeatureView defines a logical grouping of serveable features.

Parameters
  • name – Name of the group of features.

  • entities – The entities to which this group of features is associated.

  • ttl – The amount of time this group of features lives. A ttl of 0 indicates that this group of features lives forever. Note that large ttls or a ttl of 0 can result in extremely computationally intensive queries.

  • input – The source of data where this group of features is stored.

  • batch_source (optional) – The batch source of data where this group of features is stored.

  • stream_source (optional) – The stream source of data where this group of features is stored.

  • features (optional) – The set of features defined as part of this FeatureView.

  • tags (optional) – A dictionary of key-value pairs used for organizing FeatureViews.

batch_source: feast.data_source.DataSource
ensure_valid()[source]

Validates the state of this feature view locally.

Raises

ValueError – The feature view does not have a name or does not have entities.

entities: List[str]
classmethod from_proto(feature_view_proto: feast.core.FeatureView_pb2.FeatureView)[source]

Creates a feature view from a protobuf representation of a feature view.

Parameters

feature_view_proto – A protobuf representation of a feature view.

Returns

A FeatureView object based on the feature view protobuf.

input: feast.data_source.DataSource
materialization_intervals: List[Tuple[datetime.datetime, datetime.datetime]]
property most_recent_end_time: Optional[datetime.datetime]

Retrieves the latest time up to which the feature view has been materialized.

Returns

The latest time, or None if the feature view has not been materialized.

online: bool
property proto_class: Type[feast.core.FeatureView_pb2.FeatureView]
stream_source: Optional[feast.data_source.DataSource]
tags: Optional[Dict[str, str]]
to_proto() feast.core.FeatureView_pb2.FeatureView[source]

Converts a feature view object to its protobuf representation.

Returns

A FeatureViewProto protobuf.

ttl: datetime.timedelta
with_join_key_map(join_key_map: Dict[str, str])[source]

Sets the join_key_map by returning a copy of this feature view with that field set. This join_key mapping operation is only used as part of query operations and will not modify the underlying FeatureView.

Parameters

join_key_map – A map of join keys in which the left is the join_key that corresponds with the feature data and the right corresponds with the entity data.

Returns

A copy of this FeatureView with the join_key_map replaced with the ‘join_key_map’ input.

Examples

Join a location feature data table to both the origin column and destination column of the entity data.

temperatures_feature_service = FeatureService(
    name="temperatures",
    features=[
        location_stats_feature_view
            .with_name("origin_stats")
            .with_join_key_map({"location_id": "origin_id"}),
        location_stats_feature_view
            .with_name("destination_stats")
            .with_join_key_map({"location_id": "destination_id"}),
    ],
)

with_name(name: str)[source]

Renames this feature view by returning a copy of this feature view with an alias set for the feature view name. This rename operation is only used as part of query operations and will not modify the underlying FeatureView.

Parameters

name – Name to assign to the FeatureView copy.

Returns

A copy of this FeatureView with the name replaced with the ‘name’ input.

with_projection(feature_view_projection: feast.feature_view_projection.FeatureViewProjection)[source]

Sets the feature view projection by returning a copy of this feature view with its projection set to the given projection. A projection is an object that stores the modifications to a feature view that is used during query operations.

Parameters

feature_view_projection – The FeatureViewProjection object to link to this FeatureView.

Returns

A copy of this FeatureView with its projection replaced with the ‘feature_view_projection’ argument.

feast.names module

feast.online_response module

class feast.online_response.OnlineResponse(online_response_proto: feast.serving.ServingService_pb2.GetOnlineFeaturesResponse)[source]

Bases: object

Defines an online response in feast.

to_df(include_event_timestamps: bool = False) pandas.core.frame.DataFrame[source]

Converts GetOnlineFeaturesResponse features into a Pandas DataFrame.

Args: include_event_timestamps: bool Optionally include feature timestamps in the DataFrame

to_dict(include_event_timestamps: bool = False) Dict[str, Any][source]

Converts GetOnlineFeaturesResponse features into a dictionary form.

Args: include_event_timestamps: bool Optionally include feature timestamps in the dictionary
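
Examples

A minimal sketch of obtaining an online response and converting it to a dictionary and a DataFrame, assuming a feature repository exists at feature_repo; the feature reference and entity row are illustrative.

>>> from feast import FeatureStore
>>> fs = FeatureStore(repo_path="feature_repo")
>>> online_response = fs.get_online_features(
...     features=["driver_hourly_stats:conv_rate"],
...     entity_rows=[{"driver_id": 1001}],
... )
>>> features_dict = online_response.to_dict()
>>> features_df = online_response.to_df(include_event_timestamps=True)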

feast.registry module

class feast.registry.FeastObjectType(value)[source]

Bases: enum.Enum

An enumeration.

DATA_SOURCE = 'data source'
ENTITY = 'entity'
FEATURE_SERVICE = 'feature service'
FEATURE_VIEW = 'feature view'
ON_DEMAND_FEATURE_VIEW = 'on demand feature view'
REQUEST_FEATURE_VIEW = 'request feature view'
static get_objects_from_registry(registry: feast.registry.Registry, project: str) Dict[feast.registry.FeastObjectType, List[Any]][source]
static get_objects_from_repo_contents(repo_contents: feast.repo_contents.RepoContents) Dict[feast.registry.FeastObjectType, Set[Any]][source]
class feast.registry.Registry(registry_config: Optional[feast.repo_config.RegistryConfig], repo_path: Optional[pathlib.Path])[source]

Bases: object

Registry: A registry allows for the management and persistence of feature definitions and related metadata.
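
Examples

A minimal sketch of constructing a Registry from a RegistryConfig and listing entities; the registry path, repo path, and project name are hypothetical placeholders.

>>> from pathlib import Path
>>> from feast.registry import Registry
>>> from feast.repo_config import RegistryConfig
>>> registry_config = RegistryConfig(path="data/registry.db")
>>> registry = Registry(registry_config, repo_path=Path("feature_repo"))
>>> entities = registry.list_entities(project="my_project")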

apply_data_source(data_source: feast.data_source.DataSource, project: str, commit: bool = True)[source]

Registers a single data source with Feast

Parameters
  • data_source – A data source that will be registered

  • project – Feast project that this data source belongs to

  • commit – Whether to immediately commit to the registry

apply_entity(entity: feast.entity.Entity, project: str, commit: bool = True)[source]

Registers a single entity with Feast

Parameters
  • entity – Entity that will be registered

  • project – Feast project that this entity belongs to

  • commit – Whether the change should be persisted immediately

apply_feature_service(feature_service: feast.feature_service.FeatureService, project: str, commit: bool = True)[source]

Registers a single feature service with Feast

Parameters
  • feature_service – A feature service that will be registered

  • project – Feast project that this feature service belongs to

apply_feature_view(feature_view: feast.base_feature_view.BaseFeatureView, project: str, commit: bool = True)[source]

Registers a single feature view with Feast

Parameters
  • feature_view – Feature view that will be registered

  • project – Feast project that this feature view belongs to

  • commit – Whether the change should be persisted immediately

apply_materialization(feature_view: feast.feature_view.FeatureView, project: str, start_date: datetime.datetime, end_date: datetime.datetime, commit: bool = True)[source]

Updates materialization intervals tracked for a single feature view in Feast

Parameters
  • feature_view – Feature view that will be updated with an additional materialization interval tracked

  • project – Feast project that this feature view belongs to

  • start_date (datetime) – Start date of the materialization interval to track

  • end_date (datetime) – End date of the materialization interval to track

  • commit – Whether the change should be persisted immediately

apply_saved_dataset(saved_dataset: feast.saved_dataset.SavedDataset, project: str, commit: bool = True)[source]

Registers a single saved dataset with Feast

Parameters
  • saved_dataset – SavedDataset that will be added / updated to registry

  • project – Feast project that this dataset belongs to

  • commit – Whether the change should be persisted immediately

cached_registry_proto: Optional[feast.core.Registry_pb2.Registry] = None
cached_registry_proto_created: Optional[datetime.datetime] = None
cached_registry_proto_ttl: datetime.timedelta
clone() feast.registry.Registry[source]
commit()[source]

Commits the state of the registry cache to the remote registry store.

delete_data_source(name: str, project: str, commit: bool = True)[source]

Deletes a data source or raises an exception if not found.

Parameters
  • name – Name of data source

  • project – Feast project that this data source belongs to

  • commit – Whether the change should be persisted immediately

delete_entity(name: str, project: str, commit: bool = True)[source]

Deletes an entity or raises an exception if not found.

Parameters
  • name – Name of entity

  • project – Feast project that this entity belongs to

  • commit – Whether the change should be persisted immediately

delete_feature_service(name: str, project: str, commit: bool = True)[source]

Deletes a feature service or raises an exception if not found.

Parameters
  • name – Name of feature service

  • project – Feast project that this feature service belongs to

  • commit – Whether the change should be persisted immediately

delete_feature_view(name: str, project: str, commit: bool = True)[source]

Deletes a feature view or raises an exception if not found.

Parameters
  • name – Name of feature view

  • project – Feast project that this feature view belongs to

  • commit – Whether the change should be persisted immediately

get_data_source(name: str, project: str, allow_cache: bool = False) feast.data_source.DataSource[source]

Retrieves a data source.

Parameters
  • name – Name of data source

  • project – Feast project that this data source belongs to

  • allow_cache – Whether to allow returning this data source from a cached registry

Returns

Returns either the specified data source, or raises an exception if none is found

get_entity(name: str, project: str, allow_cache: bool = False) feast.entity.Entity[source]

Retrieves an entity.

Parameters
  • name – Name of entity

  • project – Feast project that this entity belongs to

  • allow_cache – Whether to allow returning this entity from a cached registry

Returns

Returns either the specified entity, or raises an exception if none is found

get_feature_service(name: str, project: str, allow_cache: bool = False) feast.feature_service.FeatureService[source]

Retrieves a feature service.

Parameters
  • name – Name of feature service

  • project – Feast project that this feature service belongs to

  • allow_cache – Whether to allow returning this feature service from a cached registry

Returns

Returns either the specified feature service, or raises an exception if none is found

get_feature_view(name: str, project: str, allow_cache: bool = False) feast.feature_view.FeatureView[source]

Retrieves a feature view.

Parameters
  • name – Name of feature view

  • project – Feast project that this feature view belongs to

  • allow_cache – Allow returning feature view from the cached registry

Returns

Returns either the specified feature view, or raises an exception if none is found

get_infra(project: str, allow_cache: bool = False) feast.infra.infra_object.Infra[source]

Retrieves the stored Infra object.

Parameters
  • project – Feast project that the Infra object refers to

  • allow_cache – Whether to allow returning this entity from a cached registry

Returns

The stored Infra object.

get_on_demand_feature_view(name: str, project: str, allow_cache: bool = False) feast.on_demand_feature_view.OnDemandFeatureView[source]

Retrieves an on demand feature view.

Parameters
  • name – Name of on demand feature view

  • project – Feast project that this on demand feature view belongs to

  • allow_cache – Whether to allow returning this on demand feature view from a cached registry

Returns

Returns either the specified on demand feature view, or raises an exception if none is found

get_saved_dataset(name: str, project: str, allow_cache: bool = False) feast.saved_dataset.SavedDataset[source]

Retrieves a saved dataset.

Parameters
  • name – Name of dataset

  • project – Feast project that this dataset belongs to

  • allow_cache – Whether to allow returning this dataset from a cached registry

Returns

Returns either the specified SavedDataset, or raises an exception if none is found

list_data_sources(project: str, allow_cache: bool = False) List[feast.data_source.DataSource][source]

Retrieve a list of data sources from the registry

Parameters
  • project – Filter data source based on project name

  • allow_cache – Whether to allow returning data sources from a cached registry

Returns

List of data sources

list_entities(project: str, allow_cache: bool = False) List[feast.entity.Entity][source]

Retrieve a list of entities from the registry

Parameters
  • allow_cache – Whether to allow returning entities from a cached registry

  • project – Filter entities based on project name

Returns

List of entities

list_feature_services(project: str, allow_cache: bool = False) List[feast.feature_service.FeatureService][source]

Retrieve a list of feature services from the registry

Parameters
  • allow_cache – Whether to allow returning feature services from a cached registry

  • project – Filter feature services based on project name

Returns

List of feature services

list_feature_views(project: str, allow_cache: bool = False) List[feast.feature_view.FeatureView][source]

Retrieve a list of feature views from the registry

Parameters
  • allow_cache – Allow returning feature views from the cached registry

  • project – Filter feature views based on project name

Returns

List of feature views

list_on_demand_feature_views(project: str, allow_cache: bool = False) List[feast.on_demand_feature_view.OnDemandFeatureView][source]

Retrieve a list of on demand feature views from the registry

Parameters
  • project – Filter on demand feature views based on project name

  • allow_cache – Whether to allow returning on demand feature views from a cached registry

Returns

List of on demand feature views

list_request_feature_views(project: str, allow_cache: bool = False) List[feast.request_feature_view.RequestFeatureView][source]

Retrieve a list of request feature views from the registry

Parameters
  • allow_cache – Allow returning feature views from the cached registry

  • project – Filter feature views based on project name

Returns

List of request feature views

list_saved_datasets(project: str, allow_cache: bool = False) List[feast.saved_dataset.SavedDataset][source]

Retrieves a list of all saved datasets in specified project

Parameters
  • project – Feast project

  • allow_cache – Whether to allow returning this dataset from a cached registry

Returns

Returns the list of SavedDatasets

refresh()[source]

Refreshes the state of the registry cache by fetching the registry state from the remote registry store.

teardown()[source]

Tears down (removes) the registry.

to_dict(project: str) Dict[str, List[Any]][source]

Returns a dictionary representation of the registry contents for the specified project.

For each list in the dictionary, the elements are sorted by name, so this method can be used to compare two registries.

Parameters

project – Feast project to convert to a dict

update_infra(infra: feast.infra.infra_object.Infra, project: str, commit: bool = True)[source]

Updates the stored Infra object.

Parameters
  • infra – The new Infra object to be stored.

  • project – Feast project that the Infra object refers to

  • commit – Whether the change should be persisted immediately

feast.registry.get_registry_store_class_from_scheme(registry_path: str)[source]
feast.registry.get_registry_store_class_from_type(registry_store_type: str)[source]

feast.repo_config module

class feast.repo_config.FeastBaseModel(**extra_data: Any)[source]

Bases: pydantic.main.BaseModel

Feast Pydantic Configuration Class

class Config[source]

Bases: object

arbitrary_types_allowed = True
extra = 'allow'
class feast.repo_config.FeastConfigBaseModel[source]

Bases: pydantic.main.BaseModel

Feast Pydantic Configuration Class

class Config[source]

Bases: object

arbitrary_types_allowed = True
extra = 'forbid'
exception feast.repo_config.FeastConfigError(error_message, config_path)[source]

Bases: Exception

class feast.repo_config.RegistryConfig(*, registry_store_type: pydantic.types.StrictStr = None, path: pydantic.types.StrictStr, cache_ttl_seconds: pydantic.types.StrictInt = 600, **extra_data: Any)[source]

Bases: feast.repo_config.FeastBaseModel

Metadata Store Configuration. Configuration that relates to reading from and writing to the Feast registry.

cache_ttl_seconds: pydantic.types.StrictInt

The cache TTL is the amount of time registry state will be cached in memory. If this TTL is exceeded then the registry will be refreshed when any feature store method asks for access to registry state. The TTL can be set to infinity by setting TTL to 0 seconds, which means the cache will only be loaded once and will never expire. Users can manually refresh the cache by calling feature_store.refresh_registry()

Type

int

path: pydantic.types.StrictStr

Path to metadata store. Can be a local path, or remote object storage path, e.g. a GCS URI

Type

str

registry_store_type: Optional[pydantic.types.StrictStr]

Provider name or a class name that implements RegistryStore.

Type

str
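
Example

A minimal, hedged sketch of constructing a RegistryConfig directly; the registry path shown is hypothetical, and setting cache_ttl_seconds to 0 means the cached registry never expires and must be refreshed manually.

>>> from feast.repo_config import RegistryConfig
>>> registry_config = RegistryConfig(
...     path="data/registry.db",  # hypothetical local path; a remote URI (e.g. GCS) also works
...     cache_ttl_seconds=0,  # 0 = cache never expires; refresh via FeatureStore.refresh_registry()
... )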

class feast.repo_config.RepoConfig(*, registry: Union[pydantic.types.StrictStr, feast.repo_config.RegistryConfig] = 'data/registry.db', project: pydantic.types.StrictStr, provider: pydantic.types.StrictStr, online_store: Any = None, offline_store: Any = None, feature_server: Any = None, flags: Any = None, repo_path: pathlib.Path = None, **data: Any)[source]

Bases: feast.repo_config.FeastBaseModel

Repo config. Typically loaded from feature_store.yaml

feature_server: Optional[Any]

Feature server configuration (optional depending on provider)

Type

FeatureServerConfig

flags: Any

Feature flags for experimental features (optional)

Type

Flags

get_registry_config()[source]
offline_store: Any

Offline store configuration (optional depending on provider)

Type

OfflineStoreConfig

online_store: Any

Online store configuration (optional depending on provider)

Type

OnlineStoreConfig

project: pydantic.types.StrictStr

Feast project id. This can be any alphanumeric string up to 16 characters. You can have multiple independent feature repositories deployed to the same cloud provider account, as long as they have different project ids.

Type

str

provider: pydantic.types.StrictStr

local or gcp or aws

Type

str

registry: Union[pydantic.types.StrictStr, feast.repo_config.RegistryConfig]

Path to metadata store. Can be a local path, or remote object storage path, e.g. a GCS URI

Type

str

repo_path: Optional[pathlib.Path]
write_to_path(repo_path: pathlib.Path)[source]
feast.repo_config.get_data_source_class_from_type(data_source_type: str)[source]
feast.repo_config.get_feature_server_config_from_type(feature_server_type: str)[source]
feast.repo_config.get_offline_config_from_type(offline_store_type: str)[source]
feast.repo_config.get_online_config_from_type(online_store_type: str)[source]
feast.repo_config.load_repo_config(repo_path: pathlib.Path) feast.repo_config.RepoConfig[source]
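
Example

A hedged sketch of constructing a RepoConfig programmatically instead of loading it from feature_store.yaml with load_repo_config; the project id and registry path are hypothetical, and provider-specific defaults (such as the online store) are filled in by the config validators.

>>> from feast import RepoConfig
>>> config = RepoConfig(
...     project="my_project",  # hypothetical project id
...     provider="local",
...     registry="data/registry.db",  # hypothetical registry path
... )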

feast.repo_operations module

feast.repo_operations.apply_total(repo_config: feast.repo_config.RepoConfig, repo_path: pathlib.Path, skip_source_validation: bool)[source]
feast.repo_operations.apply_total_with_repo_instance(store: feast.feature_store.FeatureStore, project: str, registry: feast.registry.Registry, repo: feast.repo_contents.RepoContents, skip_source_validation: bool)[source]
feast.repo_operations.cli_check_repo(repo_path: pathlib.Path)[source]
feast.repo_operations.extract_objects_for_apply_delete(project, registry, repo)[source]
feast.repo_operations.generate_project_name() str[source]

Generates a unique project name

feast.repo_operations.get_ignore_files(repo_root: pathlib.Path, ignore_paths: List[str]) Set[pathlib.Path][source]

Get all ignore files that match any of the user-defined ignore paths

feast.repo_operations.get_repo_files(repo_root: pathlib.Path) List[pathlib.Path][source]

Get the list of all repo files, ignoring undesired files & directories specified in .feastignore

feast.repo_operations.init_repo(repo_name: str, template: str)[source]
feast.repo_operations.is_valid_name(name: str) bool[source]

Checks that a name contains only alphanumeric characters and underscores and does not start with an underscore

feast.repo_operations.log_infra_changes(views_to_keep: Set[feast.feature_view.FeatureView], views_to_delete: Set[feast.feature_view.FeatureView])[source]
feast.repo_operations.parse_repo(repo_root: pathlib.Path) feast.repo_contents.RepoContents[source]

Collect feature table definitions from feature repo

feast.repo_operations.plan(repo_config: feast.repo_config.RepoConfig, repo_path: pathlib.Path, skip_source_validation: bool)[source]
feast.repo_operations.py_path_to_module(path: pathlib.Path, repo_root: pathlib.Path) str[source]
feast.repo_operations.read_feastignore(repo_root: pathlib.Path) List[str][source]

Read .feastignore in the repo root directory (if exists) and return the list of user-defined ignore paths

feast.repo_operations.registry_dump(repo_config: feast.repo_config.RepoConfig, repo_path: pathlib.Path)[source]

For debugging only: output contents of the metadata registry

feast.repo_operations.replace_str_in_file(file_path, match_str, sub_str)[source]
feast.repo_operations.teardown(repo_config: feast.repo_config.RepoConfig, repo_path: pathlib.Path)[source]

feast.telemetry module

feast.type_map module

feast.type_map.bq_to_feast_value_type(bq_type_as_str: str) feast.value_type.ValueType[source]
feast.type_map.feast_value_type_to_pandas_type(value_type: feast.value_type.ValueType) Any[source]
feast.type_map.feast_value_type_to_python_type(field_value_proto: feast.types.Value_pb2.Value) Any[source]

Converts a field value proto to the corresponding native Python value for its Feast value type.

Parameters

field_value_proto – Field value Proto

Returns

Python native type representation/version of the given field_value_proto

feast.type_map.pa_to_feast_value_type(pa_type_as_str: str) feast.value_type.ValueType[source]
feast.type_map.pa_to_redshift_value_type(pa_type: pyarrow.lib.DataType) str[source]
feast.type_map.python_type_to_feast_value_type(name: str, value: Optional[Any] = None, recurse: bool = True, type_name: Optional[str] = None) feast.value_type.ValueType[source]

Finds the equivalent Feast Value Type for a Python value. Both native and Pandas types are supported. This function will recursively look for nested types when arrays are detected. All types must be homogeneous.

Parameters
  • name – Name of the value or field

  • value – Value that will be inspected

  • recurse – Whether to recursively look for nested types in arrays

Returns

Feast Value Type
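
Example

A small sketch of inferring a Feast value type from a native Python value; the field name is hypothetical.

>>> from feast.type_map import python_type_to_feast_value_type
>>> value_type = python_type_to_feast_value_type(name="trip_count", value=42)  # expected to resolve to ValueType.INT64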

feast.type_map.python_values_to_feast_value_type(name: str, values: Any, recurse: bool = True) feast.value_type.ValueType[source]
feast.type_map.python_values_to_proto_values(values: List[Any], feature_type: feast.value_type.ValueType = ValueType.UNKNOWN) List[feast.types.Value_pb2.Value][source]
feast.type_map.redshift_to_feast_value_type(redshift_type_as_str: str) feast.value_type.ValueType[source]
feast.type_map.snowflake_python_type_to_feast_value_type(snowflake_python_type_as_str: str) feast.value_type.ValueType[source]
feast.type_map.spark_schema_to_np_dtypes(dtypes: List[Tuple[str, str]]) Iterator[numpy.dtype][source]
feast.type_map.spark_to_feast_value_type(spark_type_as_str: str) feast.value_type.ValueType[source]

feast.utils module

feast.utils.make_tzaware(t: datetime.datetime) datetime.datetime[source]

We assume tz-naive datetimes are UTC
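
Example

A short sketch showing that a tz-naive datetime is interpreted as UTC and returned as a timezone-aware value.

>>> from datetime import datetime
>>> from feast.utils import make_tzaware
>>> aware = make_tzaware(datetime(2021, 4, 12, 10, 59, 42))  # same instant, now with tzinfo=UTC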

feast.utils.to_naive_utc(ts: datetime.datetime) datetime.datetime[source]

feast.value_type module

class feast.value_type.ValueType(value)[source]

Bases: enum.Enum

Feature value type. Used to define data types in Feature Tables.

BOOL = 7
BOOL_LIST = 17
BYTES = 1
BYTES_LIST = 11
DOUBLE = 5
DOUBLE_LIST = 15
FLOAT = 6
FLOAT_LIST = 16
INT32 = 3
INT32_LIST = 13
INT64 = 4
INT64_LIST = 14
NULL = 19
STRING = 2
STRING_LIST = 12
UNIX_TIMESTAMP = 8
UNIX_TIMESTAMP_LIST = 18
UNKNOWN = 0

feast.version module

feast.version.get_version()[source]

Returns version information of the Feast Python Package.

feast.wait module

feast.wait.wait_retry_backoff(retry_fn: Callable[[], Tuple[Any, bool]], timeout_secs: int = 0, timeout_msg: Optional[str] = 'Timeout while waiting for retry_fn() to return True', max_interval_secs: int = 60) Any[source]

Repeatedly calls the given retry_fn until it returns a True success flag. Waits with an exponential backoff between retries and raises TimeoutError once the timeout is reached.

Parameters
  • retry_fn – Callable that returns a result and a boolean success flag.

  • timeout_secs – Timeout in seconds after which to give up retrying and raise TimeoutError, or 0 to retry perpetually.

  • timeout_msg – Message to use when raising TimeoutError.

  • max_interval_secs – Maximum number of seconds to wait between retries.

Returns

Returned Result from retry_fn() if success flag is True.
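
Example

A hedged sketch of a retry_fn that succeeds immediately; the callable and timeout are illustrative only.

>>> from feast.wait import wait_retry_backoff
>>> result = wait_retry_backoff(
...     retry_fn=lambda: ("done", True),  # returns (result, success); success=True stops retrying
...     timeout_secs=10,
... )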

Module contents

class feast.BigQuerySource(name: Optional[str] = None, event_timestamp_column: Optional[str] = '', table: Optional[str] = None, table_ref: Optional[str] = None, created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '', query: Optional[str] = None)[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL

name: str
property query
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

property table_ref
to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a BigQuerySource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
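
Example

A hedged sketch of defining a BigQuerySource; the table reference and column names are hypothetical.

>>> from feast import BigQuerySource
>>> driver_stats_source = BigQuerySource(
...     table="my_project.my_dataset.driver_hourly_stats",  # hypothetical BigQuery table
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )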

class feast.Entity(name: str, value_type: feast.value_type.ValueType = ValueType.UNKNOWN, description: str = '', join_key: Optional[str] = None, tags: Dict[str, str] = None, labels: Optional[Dict[str, str]] = None, owner: str = '')[source]

Bases: object

An Entity represents a class of objects (e.g., drivers or customers) to which features can be attached. An entity can also contain associated metadata.

name

The unique name of the entity.

Type

str

value_type

The type of the entity, such as string or float.

Type

feast.value_type.ValueType

join_key

A property that uniquely identifies different entities within the collection. The join_key property is typically used for joining entities with their associated features. If not specified, defaults to the name.

Type

str

description

A human-readable description.

Type

str

tags

A dictionary of key-value pairs to store arbitrary metadata.

Type

Dict[str, str]

owner

The owner of the entity, typically the email of the primary maintainer.

Type

str

created_timestamp

The time when the entity was created.

Type

Optional[datetime.datetime]

last_updated_timestamp

The time when the entity was last updated.

Type

Optional[datetime.datetime]

created_timestamp: Optional[datetime.datetime]
description: str
classmethod from_proto(entity_proto: feast.core.Entity_pb2.Entity)[source]

Creates an entity from a protobuf representation of an entity.

Parameters

entity_proto – A protobuf representation of an entity.

Returns

An Entity object based on the entity protobuf.

is_valid()[source]

Validates the state of this entity locally.

Raises

ValueError – The entity does not have a name or does not have a type.

join_key: str
last_updated_timestamp: Optional[datetime.datetime]
name: str
owner: str
tags: Dict[str, str]
to_proto() feast.core.Entity_pb2.Entity[source]

Converts an entity object to its protobuf representation.

Returns

An EntityProto protobuf.

value_type: feast.value_type.ValueType
class feast.Feature(name: str, dtype: feast.value_type.ValueType, labels: Optional[Dict[str, str]] = None)[source]

Bases: object

A Feature represents a class of serveable features.

Parameters
  • name – Name of the feature.

  • dtype – The type of the feature, such as string or float.

  • labels (optional) – User-defined metadata in dictionary form.

property dtype: feast.value_type.ValueType

Gets the data type of this feature.

classmethod from_proto(feature_proto: feast.core.Feature_pb2.FeatureSpecV2)[source]
Parameters

feature_proto – FeatureSpecV2 protobuf object

Returns

Feature object

property labels: Dict[str, str]

Gets the labels of this feature.

property name

Gets the name of this feature.

to_proto() feast.core.Feature_pb2.FeatureSpecV2[source]

Converts Feature object to its Protocol Buffer representation.

Returns

A FeatureSpecProto protobuf.
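
Example

A minimal sketch of declaring a feature with its value type.

>>> from feast import Feature, ValueType
>>> conv_rate = Feature(name="conv_rate", dtype=ValueType.FLOAT)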

class feast.FeatureService(name: str, features: List[Union[feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView]], tags: Dict[str, str] = None, description: str = '', owner: str = '')[source]

Bases: object

A feature service defines a logical group of features from one or more feature views. This group of features can be retrieved together during training or serving.

name

The unique name of the feature service.

Type

str

feature_view_projections

A list containing feature views and feature view projections, representing the features in the feature service.

Type

List[feast.feature_view_projection.FeatureViewProjection]

description

A human-readable description.

Type

str

tags

A dictionary of key-value pairs to store arbitrary metadata.

Type

Dict[str, str]

owner

The owner of the feature service, typically the email of the primary maintainer.

Type

str

created_timestamp

The time when the feature service was created.

Type

Optional[datetime.datetime]

last_updated_timestamp

The time when the feature service was last updated.

Type

Optional[datetime.datetime]

created_timestamp: Optional[datetime.datetime] = None
description: str
feature_view_projections: List[feast.feature_view_projection.FeatureViewProjection]
classmethod from_proto(feature_service_proto: feast.core.FeatureService_pb2.FeatureService)[source]

Converts a FeatureServiceProto to a FeatureService object.

Parameters

feature_service_proto – A protobuf representation of a FeatureService.

last_updated_timestamp: Optional[datetime.datetime] = None
name: str
owner: str
tags: Dict[str, str]
to_proto() feast.core.FeatureService_pb2.FeatureService[source]

Converts a feature service to its protobuf representation.

Returns

A FeatureServiceProto protobuf.

validate()[source]
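
Example

A hedged sketch of grouping features into a feature service; driver_hourly_stats_view is assumed to be a FeatureView defined elsewhere (for instance as in the apply example below).

>>> from feast import FeatureService
>>> driver_activity = FeatureService(
...     name="driver_activity",
...     features=[driver_hourly_stats_view],  # assumed FeatureView defined elsewhere
... )
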
class feast.FeatureStore(repo_path: Optional[str] = None, config: Optional[feast.repo_config.RepoConfig] = None)[source]

Bases: object

A FeatureStore object is used to define, create, and retrieve features.

Parameters
  • repo_path (optional) – Path to a feature_store.yaml used to configure the feature store.

  • config (optional) – Configuration object used to configure the feature store.

apply(objects: Union[feast.data_source.DataSource, feast.entity.Entity, feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.feature_service.FeatureService, List[Union[feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.entity.Entity, feast.feature_service.FeatureService, feast.data_source.DataSource]]], objects_to_delete: Optional[List[Union[feast.feature_view.FeatureView, feast.on_demand_feature_view.OnDemandFeatureView, feast.request_feature_view.RequestFeatureView, feast.entity.Entity, feast.feature_service.FeatureService, feast.data_source.DataSource]]] = None, partial: bool = True)[source]

Register objects to metadata store and update related infrastructure.

The apply method registers one or more definitions (e.g., Entity, FeatureView) and registers or updates these objects in the Feast registry. Once the apply method has updated the infrastructure (e.g., create tables in an online store), it will commit the updated registry. All operations are idempotent, meaning they can safely be rerun.

Parameters
  • objects – A single object, or a list of objects that should be registered with the Feature Store.

  • objects_to_delete – A list of objects to be deleted from the registry and removed from the provider’s infrastructure. This deletion will only be performed if partial is set to False.

  • partial – If True, apply will only handle the specified objects; if False, apply will also delete all the objects in objects_to_delete, and tear down any associated cloud resources.

Raises

ValueError – The ‘objects’ parameter could not be parsed properly.

Examples

Register an Entity and a FeatureView.

>>> from feast import FeatureStore, Entity, FeatureView, Feature, ValueType, FileSource, RepoConfig
>>> from datetime import timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> driver = Entity(name="driver_id", value_type=ValueType.INT64, description="driver id")
>>> driver_hourly_stats = FileSource(
...     path="feature_repo/data/driver_stats.parquet",
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )
>>> driver_hourly_stats_view = FeatureView(
...     name="driver_hourly_stats",
...     entities=["driver_id"],
...     ttl=timedelta(seconds=86400 * 1),
...     batch_source=driver_hourly_stats,
... )
>>> fs.apply([driver_hourly_stats_view, driver]) # register entity and feature view
config: feast.repo_config.RepoConfig
create_saved_dataset(from_: feast.infra.offline_stores.offline_store.RetrievalJob, name: str, storage: feast.saved_dataset.SavedDatasetStorage, tags: Optional[Dict[str, str]] = None, feature_service: Optional[feast.feature_service.FeatureService] = None, profiler: Optional[feast.dqm.profilers.ge_profiler.GEProfiler] = None) feast.saved_dataset.SavedDataset[source]

Executes the provided retrieval job and persists its outcome in the given storage. The storage type (e.g., BigQuery or Redshift) must match the globally configured offline store. After the data has been successfully persisted, a saved dataset object with dataset metadata is committed to the registry. The name of the saved dataset should be unique within the project, since a previously stored dataset with the same name will be overwritten.

Returns

SavedDataset object with attached RetrievalJob

Raises

ValueError – The given retrieval job does not have metadata.
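
Example

A hedged sketch of persisting a retrieval job as a saved dataset, assuming the file offline store and its SavedDatasetFileStorage class; the dataset name, path, and feature references are hypothetical, and entity_df is an entity dataframe built as in the get_historical_features example below.

>>> from feast import FeatureStore
>>> from feast.infra.offline_stores.file_source import SavedDatasetFileStorage
>>> fs = FeatureStore(repo_path="feature_repo")
>>> job = fs.get_historical_features(
...     entity_df=entity_df,
...     features=["driver_hourly_stats:conv_rate"],
... )
>>> dataset = fs.create_saved_dataset(
...     from_=job,
...     name="driver_training_dataset",  # hypothetical dataset name
...     storage=SavedDatasetFileStorage(path="data/driver_training_dataset.parquet"),  # hypothetical path
... )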

delete_feature_service(name: str)[source]

Deletes a feature service.

Parameters

name – Name of feature service.

Raises

FeatureServiceNotFoundException – The feature service could not be found.

delete_feature_view(name: str)[source]

Deletes a feature view.

Parameters

name – Name of feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

static ensure_request_data_values_exist(needed_request_data: Set[str], needed_request_fv_features: Set[str], request_data_features: Dict[str, List[Any]])[source]
get_data_source(name: str) feast.data_source.DataSource[source]

Retrieves a data source from the registry.

Parameters

name – Name of the data source.

Returns

The specified data source.

Raises

DataSourceObjectNotFoundException – The data source could not be found.

get_entity(name: str) feast.entity.Entity[source]

Retrieves an entity.

Parameters

name – Name of entity.

Returns

The specified entity.

Raises

EntityNotFoundException – The entity could not be found.

get_feature_server_endpoint() Optional[str][source]

Returns endpoint for the feature server, if it exists.

get_feature_service(name: str, allow_cache: bool = False) feast.feature_service.FeatureService[source]

Retrieves a feature service.

Parameters
  • name – Name of feature service.

  • allow_cache – Whether to allow returning feature services from a cached registry.

Returns

The specified feature service.

Raises

FeatureServiceNotFoundException – The feature service could not be found.

get_feature_view(name: str) feast.feature_view.FeatureView[source]

Retrieves a feature view.

Parameters

name – Name of feature view.

Returns

The specified feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

get_historical_features(entity_df: Union[pandas.core.frame.DataFrame, str], features: Union[List[str], feast.feature_service.FeatureService], full_feature_names: bool = False) feast.infra.offline_stores.offline_store.RetrievalJob[source]

Enrich an entity dataframe with historical feature values for either training or batch scoring.

This method joins historical feature data from one or more feature views to an entity dataframe by using a time travel join.

Each feature view is joined to the entity dataframe using all entities configured for the respective feature view. All configured entities must be available in the entity dataframe. Therefore, the entity dataframe must contain all entities found in all feature views, but the individual feature views can have different entities.

Time travel is based on the configured TTL for each feature view. A shorter TTL will limit the amount of scanning that will be done in order to find feature data for a specific entity key. Setting a short TTL may result in null values being returned.

Parameters
  • entity_df (Union[pd.DataFrame, str]) – An entity dataframe is a collection of rows containing all entity columns (e.g., customer_id, driver_id) on which features need to be joined, as well as an event_timestamp column used to ensure point-in-time correctness. Either a Pandas DataFrame can be provided or a string SQL query. The query must be of a format supported by the configured offline store (e.g., BigQuery).

  • features – A list of features that should be retrieved from the offline store. Either a list of string feature references can be provided or a FeatureService object. Feature references are of the format “feature_view:feature”, e.g., “customer_fv:daily_transactions”.

  • full_feature_names – A boolean that provides the option to add the feature view prefixes to the feature names, changing them from the format “feature” to “feature_view__feature” (e.g., “daily_transactions” changes to “customer_fv__daily_transactions”). By default, this value is set to False.

Returns

RetrievalJob which can be used to materialize the results.

Raises

ValueError – Both or neither of features and feature_refs are specified.

Examples

Retrieve historical features from a local offline store.

>>> from feast import FeatureStore, RepoConfig
>>> from datetime import datetime
>>> import pandas as pd
>>> fs = FeatureStore(repo_path="feature_repo")
>>> entity_df = pd.DataFrame.from_dict(
...     {
...         "driver_id": [1001, 1002],
...         "event_timestamp": [
...             datetime(2021, 4, 12, 10, 59, 42),
...             datetime(2021, 4, 12, 8, 12, 10),
...         ],
...     }
... )
>>> retrieval_job = fs.get_historical_features(
...     entity_df=entity_df,
...     features=[
...         "driver_hourly_stats:conv_rate",
...         "driver_hourly_stats:acc_rate",
...         "driver_hourly_stats:avg_daily_trips",
...     ],
... )
>>> feature_data = retrieval_job.to_df()
static get_needed_request_data(grouped_odfv_refs: List[Tuple[feast.on_demand_feature_view.OnDemandFeatureView, List[str]]], grouped_request_fv_refs: List[Tuple[feast.request_feature_view.RequestFeatureView, List[str]]]) Tuple[Set[str], Set[str]][source]
get_on_demand_feature_view(name: str) feast.on_demand_feature_view.OnDemandFeatureView[source]

Retrieves an on demand feature view.

Parameters

name – Name of on demand feature view.

Returns

The specified on demand feature view.

Raises

FeatureViewNotFoundException – The feature view could not be found.

get_online_features(features: Union[List[str], feast.feature_service.FeatureService], entity_rows: List[Dict[str, Any]], full_feature_names: bool = False) feast.online_response.OnlineResponse[source]

Retrieves the latest online feature data.

Note: This method will download the full feature registry the first time it is run. If you are using a remote registry like GCS or S3 then that may take a few seconds. The registry remains cached up to a TTL duration (which can be set to infinity). If the cached registry is stale (more time than the TTL has passed), then a new registry will be downloaded synchronously by this method. This download may introduce latency to online feature retrieval. In order to avoid synchronous downloads, please call refresh_registry() prior to the TTL being reached. Remember it is possible to set the cache TTL to infinity (cache forever).

Parameters
  • features – List of feature references that will be returned for each entity. Each feature reference should have the following format: “feature_view:feature” where “feature_view” & “feature” refer to the Feature and FeatureView names respectively. Only the feature name is required.

  • entity_rows – A list of dictionaries where each key-value is an entity-name, entity-value pair.

Returns

OnlineResponse containing the feature data in records.

Raises

Exception – No entity with the specified name exists.

Examples

Retrieve the latest online feature values for a set of drivers (this assumes the features have already been materialized into the online store).

>>> from feast import FeatureStore, RepoConfig
>>> fs = FeatureStore(repo_path="feature_repo")
>>> online_response = fs.get_online_features(
...     features=[
...         "driver_hourly_stats:conv_rate",
...         "driver_hourly_stats:acc_rate",
...         "driver_hourly_stats:avg_daily_trips",
...     ],
...     entity_rows=[{"driver_id": 1001}, {"driver_id": 1002}, {"driver_id": 1003}, {"driver_id": 1004}],
... )
>>> online_response_dict = online_response.to_dict()
get_saved_dataset(name: str) feast.saved_dataset.SavedDataset[source]

Finds a saved dataset in the registry by the provided name and creates a retrieval job to pull the whole dataset from storage (offline store).

If no dataset with the provided name can be found, a SavedDatasetNotFound exception is raised.

Data will be retrieved from the globally configured offline store.

Returns

SavedDataset with RetrievalJob attached

Raises

SavedDatasetNotFound

list_data_sources(allow_cache: bool = False) List[feast.data_source.DataSource][source]

Retrieves the list of data sources from the registry.

Parameters

allow_cache – Whether to allow returning data sources from a cached registry.

Returns

A list of data sources.

list_entities(allow_cache: bool = False) List[feast.entity.Entity][source]

Retrieves the list of entities from the registry.

Parameters

allow_cache – Whether to allow returning entities from a cached registry.

Returns

A list of entities.

list_feature_services() List[feast.feature_service.FeatureService][source]

Retrieves the list of feature services from the registry.

Returns

A list of feature services.

list_feature_views(allow_cache: bool = False) List[feast.feature_view.FeatureView][source]

Retrieves the list of feature views from the registry.

Parameters

allow_cache – Whether to allow returning feature views from a cached registry.

Returns

A list of feature views.

list_on_demand_feature_views(allow_cache: bool = False) List[feast.on_demand_feature_view.OnDemandFeatureView][source]

Retrieves the list of on demand feature views from the registry.

Parameters

allow_cache – Whether to allow returning on demand feature views from a cached registry.

Returns

A list of on demand feature views.

list_request_feature_views(allow_cache: bool = False) List[feast.request_feature_view.RequestFeatureView][source]

Retrieves the list of request feature views from the registry.

Parameters

allow_cache – Whether to allow returning request feature views from a cached registry.

Returns

A list of request feature views.

materialize(start_date: datetime.datetime, end_date: datetime.datetime, feature_views: Optional[List[str]] = None) None[source]

Materialize data from the offline store into the online store.

This method loads feature data in the specified interval from either the specified feature views, or all feature views if none are specified, into the online store where it is available for online serving.

Parameters
  • start_date (datetime) – Start date for time range of data to materialize into the online store

  • end_date (datetime) – End date for time range of data to materialize into the online store

  • feature_views (List[str]) – Optional list of feature view names. If selected, will only run materialization for the specified feature views.

Examples

Materialize all features into the online store over the interval from 3 hours ago to 10 minutes ago.

>>> from feast import FeatureStore, RepoConfig
>>> from datetime import datetime, timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> fs.materialize(
...     start_date=datetime.utcnow() - timedelta(hours=3), end_date=datetime.utcnow() - timedelta(minutes=10)
... )
Materializing...

...
materialize_incremental(end_date: datetime.datetime, feature_views: Optional[List[str]] = None) None[source]

Materialize incremental new data from the offline store into the online store.

This method loads incremental new feature data up to the specified end time from either the specified feature views, or all feature views if none are specified, into the online store where it is available for online serving. The start time of the interval materialized is either the most recent end time of a prior materialization or (now - ttl) if no such prior materialization exists.

Parameters
  • end_date (datetime) – End date for time range of data to materialize into the online store

  • feature_views (List[str]) – Optional list of feature view names. If selected, will only run materialization for the specified feature views.

Raises

Exception – A feature view being materialized does not have a TTL set.

Examples

Materialize all features into the online store up to 5 minutes ago.

>>> from feast import FeatureStore, RepoConfig
>>> from datetime import datetime, timedelta
>>> fs = FeatureStore(repo_path="feature_repo")
>>> fs.materialize_incremental(end_date=datetime.utcnow() - timedelta(minutes=5))
Materializing...

...
property project: str

Gets the project of this feature store.

refresh_registry()[source]

Fetches and caches a copy of the feature registry in memory.

Explicitly calling this method allows for direct control of the state of the registry cache. Every time this method is called the complete registry state will be retrieved from the remote registry store backend (e.g., GCS, S3), and the cache timer will be reset. If refresh_registry() is run before get_online_features() is called, then get_online_features() will use the cached registry instead of retrieving (and caching) the registry itself.

Additionally, the TTL for the registry cache can be set to infinity (by setting it to 0), which means that refresh_registry() will become the only way to update the cached registry. If the TTL is set to a value greater than 0, then once the cache becomes stale (more time than the TTL has passed), a new cache will be downloaded synchronously, which may increase latencies if the triggering method is get_online_features()

property registry: feast.registry.Registry

Gets the registry of this feature store.

repo_path: pathlib.Path
serve(host: str, port: int, no_access_log: bool) None[source]

Start the feature consumption server locally on a given port.

serve_transformations(port: int) None[source]

Start the feature transformation server locally on a given port.

teardown()[source]

Tears down all local and cloud resources for the feature store.

version() str[source]

Returns the version of the current Feast SDK/CLI.

write_to_online_store(feature_view_name: str, df: pandas.core.frame.DataFrame, allow_registry_cache: bool = True)[source]

Ingests data directly into the online store.
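
Example

A hedged sketch of writing a single row directly into the online store; the feature view name and columns mirror the driver_hourly_stats examples above and are assumed to match the registered schema.

>>> from feast import FeatureStore
>>> import pandas as pd
>>> from datetime import datetime
>>> fs = FeatureStore(repo_path="feature_repo")
>>> df = pd.DataFrame.from_dict(
...     {
...         "driver_id": [1001],
...         "event_timestamp": [datetime.utcnow()],
...         "created": [datetime.utcnow()],
...         "conv_rate": [0.85],
...         "acc_rate": [0.91],
...         "avg_daily_trips": [14],
...     }
... )
>>> fs.write_to_online_store("driver_hourly_stats", df)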

class feast.FeatureView(name: str, entities: List[str], ttl: Union[google.protobuf.duration_pb2.Duration, datetime.timedelta], input: Optional[feast.data_source.DataSource] = None, batch_source: Optional[feast.data_source.DataSource] = None, stream_source: Optional[feast.data_source.DataSource] = None, features: Optional[List[feast.feature.Feature]] = None, tags: Optional[Dict[str, str]] = None, online: bool = True)[source]

Bases: feast.base_feature_view.BaseFeatureView

A FeatureView defines a logical grouping of serveable features.

Parameters
  • name – Name of the group of features.

  • entities – The entities to which this group of features is associated.

  • ttl – The amount of time this group of features lives. A ttl of 0 indicates that this group of features lives forever. Note that large ttls or a ttl of 0 can result in extremely computationally intensive queries.

  • input – The source of data where this group of features is stored.

  • batch_source (optional) – The batch source of data where this group of features is stored.

  • stream_source (optional) – The stream source of data where this group of features is stored.

  • features (optional) – The set of features defined as part of this FeatureView.

  • tags (optional) – A dictionary of key-value pairs used for organizing FeatureViews.

batch_source: feast.data_source.DataSource
created_timestamp: Optional[datetime.datetime]
ensure_valid()[source]

Validates the state of this feature view locally.

Raises

ValueError – The feature view does not have a name or does not have entities.

entities: List[str]
classmethod from_proto(feature_view_proto: feast.core.FeatureView_pb2.FeatureView)[source]

Creates a feature view from a protobuf representation of a feature view.

Parameters

feature_view_proto – A protobuf representation of a feature view.

Returns

A FeatureView object based on the feature view protobuf.

input: feast.data_source.DataSource
last_updated_timestamp: Optional[datetime.datetime]
materialization_intervals: List[Tuple[datetime.datetime, datetime.datetime]]
property most_recent_end_time: Optional[datetime.datetime]

Retrieves the latest time up to which the feature view has been materialized.

Returns

The latest time, or None if the feature view has not been materialized.

online: bool
property proto_class: Type[feast.core.FeatureView_pb2.FeatureView]
stream_source: Optional[feast.data_source.DataSource]
tags: Optional[Dict[str, str]]
to_proto() feast.core.FeatureView_pb2.FeatureView[source]

Converts a feature view object to its protobuf representation.

Returns

A FeatureViewProto protobuf.

ttl: datetime.timedelta
with_join_key_map(join_key_map: Dict[str, str])[source]

Sets the join_key_map by returning a copy of this feature view with that field set. This join_key mapping operation is only used as part of query operations and will not modify the underlying FeatureView.

Parameters

join_key_map – A map of join keys in which the left is the join_key that corresponds with the feature data and the right corresponds with the entity data.

Returns

A copy of this FeatureView with the join_key_map replaced with the ‘join_key_map’ input.

Examples

Join a location feature data table to both the origin column and destination column of the entity data.

temperatures_feature_service = FeatureService(
    name="temperatures",
    features=[
        location_stats_feature_view
            .with_name("origin_stats")
            .with_join_key_map({"location_id": "origin_id"}),
        location_stats_feature_view
            .with_name("destination_stats")
            .with_join_key_map({"location_id": "destination_id"}),
    ],
)

with_name(name: str)[source]

Renames this feature view by returning a copy of this feature view with an alias set for the feature view name. This rename operation is only used as part of query operations and will not modify the underlying FeatureView.

Parameters

name – Name to assign to the FeatureView copy.

Returns

A copy of this FeatureView with the name replaced with the ‘name’ input.

with_projection(feature_view_projection: feast.feature_view_projection.FeatureViewProjection)[source]

Sets the feature view projection by returning a copy of this feature view with its projection set to the given projection. A projection is an object that stores the modifications to a feature view that is used during query operations.

Parameters

feature_view_projection – The FeatureViewProjection object to link to this FeatureView.

Returns

A copy of this FeatureView with its projection replaced with the ‘feature_view_projection’ argument.

class feast.FileSource(path: str, name: Optional[str] = '', event_timestamp_column: Optional[str] = '', file_format: Optional[feast.data_format.FileFormat] = None, created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '', s3_endpoint_override: Optional[str] = None)[source]

Bases: feast.data_source.DataSource

static create_filesystem_and_path(path: str, s3_endpoint_override: str) Tuple[Optional[pyarrow._fs.FileSystem], str][source]
created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
property path

Returns the path of this file data source.

static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a FileSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.

class feast.KafkaSource(name: str, event_timestamp_column: str, bootstrap_servers: str, message_format: feast.data_format.StreamFormat, topic: str, created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '')[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a KafkaSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
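
Example

A hedged sketch of defining a KafkaSource for a stream of driver statistics; the bootstrap servers, topic, and Avro schema are hypothetical placeholders.

>>> from feast import KafkaSource
>>> from feast.data_format import AvroFormat
>>> driver_stats_stream = KafkaSource(
...     name="driver_stats_stream",
...     event_timestamp_column="event_timestamp",
...     bootstrap_servers="localhost:9092",  # hypothetical broker
...     message_format=AvroFormat(schema_json='{"type": "record", "name": "DriverStats", "fields": []}'),  # placeholder schema
...     topic="driver_stats",  # hypothetical topic
... )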

class feast.KinesisSource(name: str, event_timestamp_column: str, created_timestamp_column: str, record_format: feast.data_format.StreamFormat, region: str, stream_name: str, field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '')[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a KinesisSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.

class feast.OnDemandFeatureView(name: str, features: List[feast.feature.Feature], inputs: Dict[str, Union[feast.feature_view.FeatureView, feast.feature_view_projection.FeatureViewProjection, feast.data_source.RequestDataSource]], udf: method)[source]

Bases: feast.base_feature_view.BaseFeatureView

[Experimental] An OnDemandFeatureView defines on demand transformations on existing feature view values and request data.

Parameters
  • name – Name of the group of features.

  • features – Output schema of transformation with feature names

  • inputs – The input feature views passed into the transform.

  • udf – User defined transformation function that takes as input pandas dataframes

classmethod from_proto(on_demand_feature_view_proto: feast.core.OnDemandFeatureView_pb2.OnDemandFeatureView)[source]

Creates an on demand feature view from a protobuf representation.

Parameters

on_demand_feature_view_proto – A protobuf representation of an on-demand feature view.

Returns

An OnDemandFeatureView object based on the on-demand feature view protobuf.

get_request_data_schema() Dict[str, feast.value_type.ValueType][source]
static get_requested_odfvs(feature_refs, project, registry)[source]
get_transformed_features_df(df_with_features: pandas.core.frame.DataFrame, full_feature_names: bool = False) pandas.core.frame.DataFrame[source]
infer_features()[source]

Infers the set of features associated to this feature view from the input source.

Raises

RegistryInferenceFailure – The set of features could not be inferred.

input_feature_view_projections: Dict[str, feast.feature_view_projection.FeatureViewProjection]
input_request_data_sources: Dict[str, feast.data_source.RequestDataSource]
property proto_class: Type[feast.core.OnDemandFeatureView_pb2.OnDemandFeatureView]
to_proto() feast.core.OnDemandFeatureView_pb2.OnDemandFeatureView[source]

Converts an on demand feature view object to its protobuf representation.

Returns

An OnDemandFeatureViewProto protobuf.

udf: method
class feast.RedshiftSource(name: Optional[str] = None, event_timestamp_column: Optional[str] = '', table: Optional[str] = None, schema: Optional[str] = None, created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '', query: Optional[str] = None)[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Creates a RedshiftSource from a protobuf representation of a RedshiftSource.

Parameters

data_source – A protobuf representation of a RedshiftSource

Returns

A RedshiftSource object based on the data_source protobuf.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns a mapping of column names to types for this Redshift source.

Parameters

config – A RepoConfig describing the feature repo

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
property query

Returns the Redshift options of this Redshift source.

property schema

Returns the schema of this Redshift source.

static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

property table

Returns the table of this Redshift source.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a RedshiftSource object to its protobuf representation.

Returns

A DataSourceProto object.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
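
Example

A hedged sketch of defining a RedshiftSource; the schema, table, and column names are hypothetical.

>>> from feast import RedshiftSource
>>> driver_stats_source = RedshiftSource(
...     schema="public",  # hypothetical Redshift schema
...     table="driver_hourly_stats",  # hypothetical table
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )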

class feast.RepoConfig(*, registry: Union[pydantic.types.StrictStr, feast.repo_config.RegistryConfig] = 'data/registry.db', project: pydantic.types.StrictStr, provider: pydantic.types.StrictStr, online_store: Any = None, offline_store: Any = None, feature_server: Any = None, flags: Any = None, repo_path: pathlib.Path = None, **data: Any)[source]

Bases: feast.repo_config.FeastBaseModel

Repo config. Typically loaded from feature_store.yaml

feature_server: Optional[Any]

Feature server configuration (optional depending on provider)

Type

FeatureServerConfig

flags: Any

Feature flags for experimental features (optional)

Type

Flags

get_registry_config()[source]
offline_store: Any

Offline store configuration (optional depending on provider)

Type

OfflineStoreConfig

online_store: Any

Online store configuration (optional depending on provider)

Type

OnlineStoreConfig

project: pydantic.types.StrictStr

Feast project id. This can be any alphanumeric string up to 16 characters. You can have multiple independent feature repositories deployed to the same cloud provider account, as long as they have different project ids.

Type

str

provider: pydantic.types.StrictStr

local or gcp or aws

Type

str

registry: Union[pydantic.types.StrictStr, feast.repo_config.RegistryConfig]

Path to metadata store. Can be a local path, or remote object storage path, e.g. a GCS URI

Type

str

repo_path: Optional[pathlib.Path]
write_to_path(repo_path: pathlib.Path)[source]
class feast.RequestFeatureView(name: str, request_data_source: feast.data_source.RequestDataSource)[source]

Bases: feast.base_feature_view.BaseFeatureView

[Experimental] A RequestFeatureView defines a feature that is available from the inference request.

Parameters
  • name – Name of the group of features.

  • request_data_source – Request data source that specifies the schema and features

classmethod from_proto(request_feature_view_proto: feast.core.RequestFeatureView_pb2.RequestFeatureView)[source]

Creates a request feature view from a protobuf representation.

Parameters

request_feature_view_proto – A protobuf representation of a request feature view.

Returns

A RequestFeatureView object based on the request feature view protobuf.

property proto_class: Type[feast.core.RequestFeatureView_pb2.RequestFeatureView]
request_data_source: feast.data_source.RequestDataSource
to_proto() feast.core.RequestFeatureView_pb2.RequestFeatureView[source]

Converts a request feature view object to its protobuf representation.

Returns

A RequestFeatureViewProto protobuf.

class feast.SnowflakeSource(name: Optional[str] = None, database: Optional[str] = None, schema: Optional[str] = None, table: Optional[str] = None, query: Optional[str] = None, event_timestamp_column: Optional[str] = '', created_timestamp_column: Optional[str] = '', field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = '')[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
property database

Returns the database of this snowflake source.

date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
static from_proto(data_source: feast.core.DataSource_pb2.DataSource)[source]

Creates a SnowflakeSource from a protobuf representation of a SnowflakeSource.

Parameters

data_source – A protobuf representation of a SnowflakeSource

Returns

A SnowflakeSource object based on the data_source protobuf.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns a mapping of column names to types for this snowflake source.

Parameters

config – A RepoConfig describing the feature repo

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL.

name: str
property query

Returns the snowflake options of this snowflake source.

property schema

Returns the schema of this snowflake source.

static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

property table

Returns the table of this snowflake source.

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a SnowflakeSource object to its protobuf representation.

Returns

A DataSourceProto object.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
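
Example

A hedged sketch of defining a SnowflakeSource; the database, schema, table, and column names are hypothetical.

>>> from feast import SnowflakeSource
>>> driver_stats_source = SnowflakeSource(
...     database="FEAST",  # hypothetical database
...     schema="PUBLIC",
...     table="DRIVER_HOURLY_STATS",  # hypothetical table
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )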

class feast.SourceType(value)[source]

Bases: enum.Enum

DataSource value type. Used to define source types in DataSource.

BATCH_BIGQUERY = 2
BATCH_FILE = 1
STREAM_KAFKA = 3
STREAM_KINESIS = 4
UNKNOWN = 0
class feast.SparkSource(name: Optional[str] = None, table: Optional[str] = None, query: Optional[str] = None, path: Optional[str] = None, file_format: Optional[str] = None, event_timestamp_column: Optional[str] = None, created_timestamp_column: Optional[str] = None, field_mapping: Optional[Dict[str, str]] = None, date_partition_column: Optional[str] = None)[source]

Bases: feast.data_source.DataSource

created_timestamp_column: str
date_partition_column: str
event_timestamp_column: str
field_mapping: Dict[str, str]
property file_format

Returns the file format of this feature data source.

static from_proto(data_source: feast.core.DataSource_pb2.DataSource) Any[source]

Converts data source config in protobuf spec to a DataSource class object.

Parameters

data_source – A protobuf representation of a DataSource.

Returns

A DataSource class object.

Raises

ValueError – The type of DataSource could not be identified.

get_table_column_names_and_types(config: feast.repo_config.RepoConfig) Iterable[Tuple[str, str]][source]

Returns the list of column names and raw column types.

Parameters

config – Configuration object used to configure a feature store.

get_table_query_string() str[source]

Returns a string that can directly be used to reference this table in SQL

name: str
property path

Returns the path of the spark data source file.

property query

Returns the query of this feature data source

static source_datatype_to_feast_value_type() Callable[[str], feast.value_type.ValueType][source]

Returns the callable method that returns Feast type given the raw column type.

property table

Returns the table of this feature data source

to_proto() feast.core.DataSource_pb2.DataSource[source]

Converts a SparkSource object to its protobuf representation.

validate(config: feast.repo_config.RepoConfig)[source]

Validates the underlying data source.

Parameters

config – Configuration object used to configure a feature store.
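
Example

A hedged sketch of defining a SparkSource backed by a table registered in the Spark metastore; the table and column names are hypothetical.

>>> from feast import SparkSource
>>> driver_stats_source = SparkSource(
...     name="driver_hourly_stats",
...     table="driver_hourly_stats",  # hypothetical metastore table
...     event_timestamp_column="event_timestamp",
...     created_timestamp_column="created",
... )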

class feast.ValueType(value)[source]

Bases: enum.Enum

Feature value type. Used to define data types in Feature Tables.

BOOL = 7
BOOL_LIST = 17
BYTES = 1
BYTES_LIST = 11
DOUBLE = 5
DOUBLE_LIST = 15
FLOAT = 6
FLOAT_LIST = 16
INT32 = 3
INT32_LIST = 13
INT64 = 4
INT64_LIST = 14
NULL = 19
STRING = 2
STRING_LIST = 12
UNIX_TIMESTAMP = 8
UNIX_TIMESTAMP_LIST = 18
UNKNOWN = 0