Feathr is the feature store that has been used in production at LinkedIn for many years and was open sourced in April 2022. Read our announcements on Open Sourcing Feathr and Feathr on Azure.
Feathr lets you:
- Define features based on raw data sources (batch and streaming) using pythonic APIs.
- Register and get features by names during model training and model inference.
- Share features across your team and company.
Feathr automatically computes your feature values and joins them to your training data, using point-in-time-correct semantics to avoid data leakage, and supports materializing and deploying your features for use online in production.
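The point-in-time-correct semantics can be sketched in plain Python. This is illustrative only: the data and the `point_in_time_join` helper are hypothetical, not Feathr's API or implementation. For each observation, only feature values known at or before the observation time are eligible, so later values never leak into earlier training rows.

```python
from datetime import datetime

# Hypothetical observation rows: (entity key, event timestamp).
observations = [
    ("user_1", datetime(2022, 4, 10)),
    ("user_1", datetime(2022, 4, 20)),
]

# Hypothetical feature history: (entity key, timestamp the value became known, value).
# Assumed to be sorted by timestamp.
feature_history = [
    ("user_1", datetime(2022, 4, 5), 1.0),
    ("user_1", datetime(2022, 4, 15), 2.0),  # known only after the first observation
]

def point_in_time_join(observations, feature_history):
    """For each observation, pick the latest feature value whose
    timestamp is not after the observation time (no future leakage)."""
    joined = []
    for key, obs_ts in observations:
        eligible = [v for k, ts, v in feature_history if k == key and ts <= obs_ts]
        joined.append((key, obs_ts, eligible[-1] if eligible else None))
    return joined

# The first observation sees only the April 5 value; the second, occurring
# after April 15, sees the updated value.
print(point_in_time_join(observations, feature_history))
```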
- Battle tested in production for more than 6 years: LinkedIn has been using Feathr in production for over 6 years and has a dedicated team improving it.
- Scalable with built-in optimizations: in internal use cases, Feathr processes billions of rows and petabyte-scale data with built-in optimizations such as bloom filters and salted joins.
- Rich support for point-in-time joins and aggregations: Feathr has high-performance built-in operators designed for feature stores, including time-based aggregation, sliding window joins, and look-up features, all with point-in-time correctness.
- Highly customizable user-defined functions (UDFs) with native PySpark and Spark SQL support to lower the learning curve for data scientists.
- Pythonic APIs to access everything with a low learning curve; integrated with model building so data scientists can be productive from day one.
- Derived features, a capability unique among feature store solutions. This encourages feature consumers to build features on top of existing features, promoting feature reuse.
- Rich type system including support for embeddings for advanced machine learning/deep learning scenarios. One common use case is building embeddings for customer profiles, which can then be reused across an organization in all its machine learning applications.
- Native cloud integration with simplified and scalable architecture, which is illustrated in the next section.
- Feature sharing and reuse made easy: Feathr has built-in feature registry so that features can be easily shared across different teams and boost team productivity.
- To enable authentication on the Feathr UI (which is created as part of the deployment script), we need to create an Azure Active Directory (AAD) application. Currently it is not possible to create one through an ARM template, but you can easily create one by running the following CLI commands in the Cloud Shell:
# This is the prefix you want to name your resources with, make a note of it, you will need it during deployment.
prefix="YOUR_RESOURCE_PREFIX"
# Please don't change this name, a corresponding webapp with same name gets created in subsequent steps.
sitename="${prefix}webapp"
# This will create the Azure AD application, note that we need to create an AAD app of platform type Single Page Application(SPA). By default passing the redirect-uris with create command creates an app of type web.
az ad app create --display-name $sitename --sign-in-audience AzureADMyOrg --web-home-page-url "https://$sitename.azurewebsites.net" --enable-id-token-issuance true
# Fetch the ClientId, TenantId and ObjectId for the created app
aad_clientId=$(az ad app list --display-name $sitename --query [].appId -o tsv)
aad_tenantId=$(az account tenant list --query [].tenantId -o tsv)
aad_objectId=$(az ad app list --display-name $sitename --query [].id -o tsv)
# Updating the SPA app created above, currently there is no CLI support to add redirectUris to a SPA, so we have to patch manually via az rest
az rest --method PATCH --uri "https://graph.microsoft.com/v1.0/applications/$aad_objectId" --headers "Content-Type=application/json" --body "{spa:{redirectUris:['https://$sitename.azurewebsites.net/.auth/login/aad/callback']}}"
# Make a note of the ClientId and TenantId, you will need it during deployment.
echo "AAD_CLIENT_ID: $aad_clientId"
echo "AZURE_TENANT_ID: $aad_tenantId"
- Click the button below to deploy a minimal set of Feathr resources. This is not for production use, since we chose a minimal set of resources; treat it as a template that you can modify for further use. Note that you need "Owner" access on your subscription to perform some of the actions.
- For more details on Feathr, read our documentation.
- For Python API references, read the Python API Reference.
- For technical talks on Feathr, see the slides here. The recording is here.
If you want to install Feathr client in a python environment, use this:
pip install feathr
Or use the latest code from GitHub:
pip install git+https://github.com/linkedin/feathr.git#subdirectory=feathr_project
Feathr has native integrations with Databricks and Azure Synapse:
- Please read the Quick Start Guide for Feathr on Databricks to run Feathr with Databricks.
- Please read the Quick Start Guide for Feathr on Azure Synapse to run Feathr with Azure Synapse.
Please read Feathr Capabilities for more examples. Below are a few selected ones:
Feathr has highly customizable UDFs with native PySpark and Spark SQL integration to lower the learning curve for data scientists:
from pyspark.sql import DataFrame
from pyspark.sql.functions import dayofweek

def add_new_dropoff_and_fare_amount_column(df: DataFrame):
    df = df.withColumn("f_day_of_week", dayofweek("lpep_dropoff_datetime"))
    df = df.withColumn("fare_amount_cents", df.fare_amount.cast('double') * 100)
    return df
batch_source = HdfsSource(name="nycTaxiBatchSource",
                          path="abfss://[email protected]/demo_data/green_tripdata_2020-04.csv",
                          preprocessing=add_new_dropoff_and_fare_amount_column,
                          event_timestamp_column="lpep_dropoff_datetime",
                          timestamp_format="yyyy-MM-dd HH:mm:ss")
agg_features = [Feature(name="f_location_avg_fare",
                        key=location_id,  # Query/join key of the feature (group)
                        feature_type=FLOAT,
                        transform=WindowAggTransformation(  # Window aggregation transformation
                            agg_expr="cast_float(fare_amount)",
                            agg_func="AVG",  # Apply average aggregation over the window
                            window="90d")),  # Over a 90-day window
                ]
agg_anchor = FeatureAnchor(name="aggregationFeatures",
                           source=batch_source,
                           features=agg_features)
# Compute a new feature (a.k.a. derived feature) on top of existing features
derived_feature = DerivedFeature(name="f_trip_time_distance",
                                 feature_type=FLOAT,
                                 key=trip_key,
                                 input_features=[f_trip_distance, f_trip_time_duration],
                                 transform="f_trip_distance * f_trip_time_duration")

# Another example: compute embedding similarity
user_embedding = Feature(name="user_embedding", feature_type=DENSE_VECTOR, key=user_key)
item_embedding = Feature(name="item_embedding", feature_type=DENSE_VECTOR, key=item_key)
user_item_similarity = DerivedFeature(name="user_item_similarity",
                                      feature_type=FLOAT,
                                      key=[user_key, item_key],
                                      input_features=[user_embedding, item_embedding],
                                      transform="cosine_similarity(user_embedding, item_embedding)")
Read the Streaming Source Ingestion Guide for more details.
Read Point-in-time Correctness and Point-in-time Join in Feathr for more details.
Follow the quick start Jupyter Notebook to try it out. There is also a companion quick start guide containing a bit more explanation of the notebook.
- Introduction to Feathr - Beginner's guide
- Document Intelligence using Azure Feature Store (Feathr) and SynapseML
| Feathr component | Cloud Integrations |
| --- | --- |
| Offline store – Object Store | Azure Blob Storage, Azure ADLS Gen2, AWS S3 |
| Offline store – SQL | Azure SQL DB, Azure Synapse Dedicated SQL Pools, Azure SQL in VM, Snowflake |
| Streaming Source | Kafka, EventHub |
| Online store | Azure Cache for Redis |
| Feature Registry and Governance | Azure Purview |
| Compute Engine | Azure Synapse Spark Pools, Databricks |
| Machine Learning Platform | Azure Machine Learning, Jupyter Notebook, Databricks Notebook |
| File Format | Parquet, ORC, Avro, JSON, Delta Lake |
| Credentials | Azure Key Vault |
For a complete roadmap with estimated dates, please visit this page.
- Support streaming
- Support common data sources
- Support online transformation
- Support feature versioning
- Support feature monitoring
- Support feature store UI, including Lineage and Search functionalities
- Support feature data deletion and retention
Built for the community and built by the community. Check out the Community Guidelines.
Join our Slack channel for questions and discussions (or click the invitation link).