Engineered a Marketing Impact Measurement Framework for a Premium Shoe Manufacturer to Conduct Dynamic Experiments & Reduce Time to Insights

Industry

Retail

Region

US

Solution

Impact Measurement Framework

The web tool provided an effective measurement framework for determining the real-time impact of 100+ experiments run across marketing, pricing, and CRM functions

The tool delivered cost efficiencies of over 80%

The tool architecture enabled data democratization across the enterprise without risking leakage of confidential data

Experiments were recommended based on historical results, enabling decision-makers to run smarter trials in less time; decision-making time was effectively reduced from ~3 days to ~1 day

Built a marketing mix solution to measure the impact of various marketing activities and optimize spend across channels, resulting in an incremental marketing impact of 6 percentage points

An up-and-coming premium shoe manufacturer had invested heavily in marketing campaigns to deepen its footprint in a hyper-competitive marketplace, and wanted to measure the efficacy of those campaigns to derive better ROI. What initially started as an assignment to create an algorithm for measuring marketing campaign efficacy was later scaled into a full-fledged impact measurement tool that could be internalized and contextualized to the client's business needs. The tool was engineered to accept user input, analyze the impact of marketing campaign events in real time, and recommend changes to experiments to optimize investments and boost operational efficiency. The client's existing decision-making process was decentralized, which made identifying high-ROI activities within campaigns a daunting task.

The tool was built to ensure optimum granularity of tests and to identify scenarios where lift measurement was essential. From an operational perspective, the architecture was designed for ease of use and cost efficiency. The focus was also on a tool that could be seamlessly internalized into the client's existing ecosystem and strengthen their analytical capabilities for the long run.

The tool architecture can be summarized in three parts: the frontend, the backend, and the database. The frontend console allowed users to set specific parameters and request results with a few clicks. These actions triggered API calls, which connected frontend events to the backend. The backend ingested raw data, filtered it down to the relevant data points, and trained data models on the filtered set. The results produced by the data models were saved in the database and then relayed to the frontend as easy-to-grasp graphical representations. The architecture was built on platforms chosen to drive cost efficiency, enable seamless internalization, and allow contextualization to the unique business roadblocks our client might face in the future.

Frontend application: A UI development platform preferred by our client was used to design the tool's frontend console, which offered visual representations of the results derived from the data model analyses. The application layer acted as the connecting link between the frontend UI and the backend. The interface exposed its communication protocols as clickable controls such as a “country filter” and “time filter”; our client used these to enter input data and, once the data was processed in the backend, to view graphical representations of the results. Each such action triggered API calls served by the Django REST framework in the backend.
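
As a rough illustration of this request pattern, the sketch below shows what such an endpoint might look like in the Django REST framework. The view and pipeline names (ExperimentResultsView, run_lift_analysis) are hypothetical stand-ins, not the client's actual code.

```python
# Illustrative Django REST framework endpoint; all names are hypothetical.
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView


def run_lift_analysis(country, start, end):
    """Stand-in for the backend pipeline: ingest, filter, train, score."""
    return {"lift": 0.0, "window": [start, end]}


class ExperimentResultsView(APIView):
    """Serves requests triggered by the frontend's 'country filter' and
    'time filter' buttons."""

    def get(self, request):
        country = request.query_params.get("country")
        start = request.query_params.get("start_date")
        end = request.query_params.get("end_date")
        if not all([country, start, end]):
            return Response(
                {"error": "country, start_date and end_date are required"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        results = run_lift_analysis(country, start, end)
        return Response({"country": country, "results": results})
```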

Backend application: The APIs were migrated to a cloud computing environment on Amazon Elastic Compute Cloud (Amazon EC2) to enable enterprise-wide API access. AWS Lambda was used to trigger an API call on request, which eliminated the need to run the tool around the clock and cut server costs significantly.
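
A minimal sketch of this on-demand pattern, assuming an API Gateway-style proxy event; the handler body and payload are illustrative:

```python
# Hypothetical AWS Lambda handler: runs only when an API request arrives,
# so no server needs to stay up around the clock.
import json


def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    country = params.get("country", "US")
    # Placeholder for the same analysis pipeline the EC2-hosted API exposes.
    result = {"country": country, "status": "analysis triggered"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(result),
    }
```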

The logic behind using Django was to ensure maximum security: the framework had the sophistication required to encrypt all data traveling from the UI to the backend and the database. To support future business requirements, users could introduce new APIs in Django without having to change the tool's infrastructure.
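
As an illustration of that extensibility, adding a new endpoint in Django is typically a one-line routing change; the module and view names below are hypothetical:

```python
# Hypothetical urls.py: introducing a new API is a one-line routing addition,
# with no change to the tool's underlying infrastructure.
from django.http import JsonResponse
from django.urls import path


def experiment_results(request):  # existing, illustrative endpoint
    return JsonResponse({"endpoint": "experiments"})


def pricing_tests(request):  # newly introduced, illustrative endpoint
    return JsonResponse({"endpoint": "pricing-tests"})


urlpatterns = [
    path("api/experiments/", experiment_results),
    path("api/pricing-tests/", pricing_tests),  # the only line the new API needs
]
```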

Database: The database was built on Snowflake, and SQLAlchemy served as a common layer that let the APIs interact seamlessly with the database's format. Should newer databases with different syntaxes be introduced into the tool in the future, the APIs will not need to change to interact with them. To keep the database up to date, Amazon SNS was used to drive the extract, transform, and load (ETL) of data from multiple sources, and Amazon CloudWatch was deployed to trigger this event at a set time daily.
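
The sketch below illustrates this abstraction, assuming the snowflake-sqlalchemy dialect with placeholder credentials and table names. Because queries go through SQLAlchemy, swapping in another database later would largely mean changing the connection URL rather than the API code.

```python
# Illustrative SQLAlchemy setup; credentials and table names are placeholders.
from sqlalchemy import create_engine, text

# Snowflake via the snowflake-sqlalchemy dialect; any SQLAlchemy-supported
# database could be substituted here by changing only this URL.
engine = create_engine(
    "snowflake://<user>:<password>@<account>/<database>/<schema>"
)

with engine.connect() as conn:
    rows = conn.execute(
        text("SELECT experiment_id, lift FROM experiment_results "
             "WHERE country = :c"),
        {"c": "US"},
    )
    for row in rows:
        print(row.experiment_id, row.lift)
```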

Measuring Sales Lift Against Different Variables

Lift was measured as the difference in shoe sales volume between the period when marketing campaigns ran and a comparable period without campaigns. As no concrete data was available for the latter, we hypothesized the shoe sales volume for that period. To build this hypothesis, we employed time series forecasting and test-control analysis. ARIMA, Holt-Winters, and UCM were the time series forecasting techniques used to analyze the period and draw meaningful inferences about shoe sales volumes. The test-control analysis considered two data sources: a) the same market in the previous year, and b) the same year in a different market.

Same market, last year: To hypothesize the sales performance in a year (say, year x) without a marketing campaign in a specific market location, we considered the performance statistics from the year immediately preceding year x.

Same year, different market: To measure the impact of marketing campaigns on the sales performance of a specific market location (say, Y) in year x, we identified another market location that closely mirrored Y's sales performance. We presumed that, in the absence of a campaign, sales in location Y would have tracked those of the mirrored market, so the gap between the two during the campaign period reflected the campaign's impact.
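
To make the forecasting leg of this approach concrete, the minimal sketch below uses Holt-Winters exponential smoothing (one of the three techniques named above) on illustrative monthly sales figures: the model is fit on the pre-campaign window, the forecast serves as the hypothesized no-campaign baseline, and lift is the actual minus that baseline. All data and names here are hypothetical.

```python
# Counterfactual-forecast sketch of lift measurement (illustrative data).
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly shoe sales; the campaign runs over the last 3 months.
sales = pd.Series(
    [100, 105, 98, 110, 115, 108, 120, 125, 118, 160, 170, 165],
    index=pd.date_range("2023-01-01", periods=12, freq="MS"),
)
pre_campaign = sales.iloc[:-3]
campaign = sales.iloc[-3:]

# Fit on the pre-campaign window and forecast what sales would have been
# without the campaign.
model = ExponentialSmoothing(pre_campaign, trend="add").fit()
counterfactual = model.forecast(3)

# Lift = actual campaign-period sales minus the hypothesized baseline.
lift = campaign.values - counterfactual.values
print("Estimated monthly lift:", lift.round(1))
print("Total lift:", lift.sum().round(1))
```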
