This solution uses AI/ML models and comprehensive data analysis to improve the efficiency and effectiveness of Next Best Action (NBA) planning, providing curated recommendations at a monthly or weekly level based on the constraints that are set.
We have created an ML model that generates NBA predictions based on the input data and the user constraints entered through the GUI.
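In Databricks, notebook widgets are a common way to expose this kind of GUI input. The snippet below is only a hypothetical illustration of the type of constraint parameters the model might accept; the widget names, defaults, and choices are assumptions, not the package's actual parameters (those live in the "Model input parameters" notebook).

```python
# Hypothetical illustration of GUI inputs via Databricks notebook widgets.
# Widget names, defaults, and choices are assumptions for this sketch only.
dbutils.widgets.text("cadence_date", "2024-07-01", "Cadence Date")
dbutils.widgets.dropdown("planning_level", "monthly", ["monthly", "weekly"], "Planning Level")
dbutils.widgets.text("quarterly_budget", "100000", "Quarterly Budget (USD)")

# Read the values back inside the notebook (dbutils is provided by the
# Databricks notebook environment).
cadence_date = dbutils.widgets.get("cadence_date")
planning_level = dbutils.widgets.get("planning_level")
quarterly_budget = float(dbutils.widgets.get("quarterly_budget"))
```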
- Quarterly Guardrails: Establishes guardrails at the quarter level, encompassing engagement goals, vendor contracts, and other constraints to guide NBA planning.
- Budget Optimization: Consolidates budgets for Integrated Promotional Planning (IPP) to optimize touchpoint volume recommendations at the HCP level.
- Temporal Adjustment: Converts quarterly HCP recommendations into monthly recommendations using two months of historical actual promotion data.
- Optimal Touchpoint Distribution: Develops an optimal distribution and sequence of touchpoints using decision-tree-based models or other approaches to maximize effectiveness (a minimal allocation sketch follows this list).
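As an illustration of the budget-constrained touchpoint allocation idea above, the sketch below solves a tiny integer optimization with gekko (one of the libraries installed later in this README). The channels, response coefficients, costs, caps, and budget are made-up placeholders; the packaged model derives its objective and constraints from the input data and the GUI settings.

```python
# Illustrative only: budget-constrained touchpoint allocation for a single HCP
# using gekko. All numbers and channel names are hypothetical placeholders.
from gekko import GEKKO

channels = ["email", "field_call", "web_banner"]
response = {"email": 0.8, "field_call": 2.5, "web_banner": 0.4}   # assumed lift per touch
cost     = {"email": 1.0, "field_call": 15.0, "web_banner": 0.5}  # assumed cost per touch
budget   = 60.0                                                   # assumed quarterly budget

m = GEKKO(remote=False)
# Integer decision variables: number of touches per channel, within channel caps.
touches = {c: m.Var(lb=0, ub=10, integer=True) for c in channels}

# Stay within the consolidated IPP budget.
m.Equation(sum(cost[c] * touches[c] for c in channels) <= budget)
# Maximize expected engagement lift.
m.Maximize(sum(response[c] * touches[c] for c in channels))

m.options.SOLVER = 1  # APOPT handles integer variables
m.solve(disp=False)

for c in channels:
    print(c, int(touches[c].value[0]))
```

In practice, the response estimates would come from the model's decision-tree-based components rather than hard-coded values.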
Data
- Data Files: Include HCP data such as channel priority, historical HCP activity, vendor contracts, and more, all of which are referenced to generate the NBA plan based on the cadence date.
The package contains the following notebooks:
- Model Configurations: Notebook for configuring the environment manually via its variables.
- Model Input Parameters: Notebook containing the GUI for configuring the environment and executing the model.
- Data Files Setup: Notebook that takes the filePath from the environment and creates/updates the Databricks Catalog with the tables required for model execution.
- Model Workflow: Notebook containing all the model functions and logic required for NBA processing.
- Model Orchestrator: Notebook to run the model workflow with a single execution (see the sketch after this list).
- Dashboard Setup Instructions: Notebook listing all the steps required to set up the dashboard in the Databricks environment.
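For context on what a single-execution orchestration can look like, here is a hypothetical sketch using dbutils.notebook.run; the notebook paths and argument names are assumptions, not the package's actual values.

```python
# Hypothetical orchestration sketch: chain the setup and workflow notebooks
# with dbutils.notebook.run (available in any Databricks notebook).
# Paths and argument names below are assumptions for illustration only.
params = {"cadence_date": "2024-07-01", "planning_level": "monthly"}

dbutils.notebook.run("./Data Files Setup", 3600, params)   # path, timeout (s), arguments
result = dbutils.notebook.run("./Model Workflow", 7200, params)
print(result)
```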
If you are new to Databricks, create an account at: https://databricks.com/try-databricks
- Create a Databricks cluster with Databricks Runtime 14.3 LTS.
- Install the following libraries by going to the cluster you created and navigating to the Libraries tab:
- Click on "Install New".
- Navigate to PyPI.
- Enter the library names one at a time and install all of the required libraries listed below (a notebook-based alternative is shown after this list):
- PyYAML
- gekko
- kaleido
- pyspark
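If you prefer not to use the cluster Libraries UI, the same packages can be installed from a notebook cell with %pip; the install is scoped to that notebook's session.

```python
# Notebook-scoped alternative to the cluster-library installation above.
%pip install PyYAML gekko kaleido pyspark
```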
Several omnichannel-specific datasets have been used to build and run the model. Sample datasets can be found in the "Data_Files" folder, along with an additional "Data File Information.png" file containing the data specifications.
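As a minimal sketch of what the "Data Files Setup" step does with these sample files, the snippet below loads one CSV and registers it as a catalog table; the file path, catalog, schema, and table name are assumptions for illustration only.

```python
# Minimal sketch: load a sample data file and register it as a catalog table.
# The path and the three-level table name are placeholders, not the package's values.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

hcp_df = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/Volumes/main/nba/data_files/hcp_data.csv")
)
hcp_df.write.mode("overwrite").saveAsTable("main.nba.hcp_data")
```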