Large Quantitative Data Sets and Qualitative Research
Introduction: Uncovering Inconsistencies through Internal Feedback
The impetus for this project was twofold: internal employees were identifying discrepancies within their buyer service level reports, and customers were complaining about incorrect data.
Discovery:
During in-depth interviews with internal stakeholders, ranging from Procurement Buyers to Business Analysts, an intriguing trend emerged: employees kept encountering inconsistencies when reviewing the historical service level data in their reports.
Early Findings:
The interviews provided invaluable insight into the nature of the discrepancies. Specifically, the "date" attribute appeared in BI tools in roughly seven different formats, including "Julian date," "Fiscal year date," "Calendar date," "Ordered Date," "week start date," and "week end date." This divergence in data representation not only caused confusion but also hindered the ability to draw accurate historical comparisons.
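To make the divergence concrete, here is a minimal sketch of how a single order date can surface under several of these representations. The CYYDDD Julian form and Monday-based weeks are assumptions for illustration; the exact conventions inside the BI tools varied by source system.

```python
from datetime import date, timedelta

def julian_cyyddd(d: date) -> str:
    """Julian date in the common CYYDDD form: century flag, 2-digit year, day of year."""
    century_flag = (d.year // 100) - 19  # 0 for 19xx, 1 for 20xx
    return f"{century_flag}{d.year % 100:02d}{d.timetuple().tm_yday:03d}"

def week_bounds(d: date) -> tuple[date, date]:
    """Week start (Monday) and week end (Sunday) containing d."""
    start = d - timedelta(days=d.weekday())
    return start, start + timedelta(days=6)

ordered = date(2019, 3, 14)
week_start, week_end = week_bounds(ordered)
print("Calendar date:", ordered.isoformat())     # 2019-03-14
print("Julian date:  ", julian_cyyddd(ordered))  # 119073
print("Week start:   ", week_start.isoformat())  # 2019-03-11
print("Week end:     ", week_end.isoformat())    # 2019-03-17
```

The same order can therefore appear under at least four different "dates," which is exactly the ambiguity employees described.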
Through interviews and behavioral analytics, I found that employee report error rates were rising, leading to frustration, extra time spent correcting errors, and customer complaints.

Research Approach and Methodology
Unveiling User Insights through Technical Exploration
To address the complex challenge of service level reporting inaccuracies, I combined qualitative insights with quantitative data patterns and technical acumen to shed light on the intricacies of the problem.
Data Collection and User Interviews:
The initial step involved collecting data from 1:1 semi-structured interviews with stakeholders, including customer representatives and database administrators, to build understanding and unearth the pain points contributing to the frustrating user experience and the discrepancies within the Buyer Report.
Leveraging SQL and Python:
Through SQL queries, I tracked the paths employees followed when constructing reports. This allowed me to observe the sequence of actions leading to data input and report generation. Python scripts facilitated the extraction and analysis of data entry patterns, revealing the varied interpretations of the "date" attribute.
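A simplified version of one of those Python scripts is sketched below. The event export, file name, and column names are hypothetical stand-ins for the internal behavioral-analytics extract; the point is the technique of classifying each raw date value by the format it appears to follow and counting how many formats each report mixes.

```python
import re
import pandas as pd

# Hypothetical export of report-builder events; file and column names are
# stand-ins for the internal extract (user_id, report_id, date_value).
events = pd.read_csv("report_builder_events.csv")

# Patterns for the date representations surfaced in the interviews.
PATTERNS = {
    "julian":       re.compile(r"^\d{6}$"),                  # e.g. 119073 (CYYDDD)
    "calendar_iso": re.compile(r"^\d{4}-\d{2}-\d{2}$"),      # e.g. 2019-03-14
    "calendar_us":  re.compile(r"^\d{1,2}/\d{1,2}/\d{4}$"),  # e.g. 3/14/2019
    "fiscal":       re.compile(r"^FY\d{2}", re.IGNORECASE),  # e.g. FY19-P03
}

def classify(value) -> str:
    text = str(value).strip()
    for name, pattern in PATTERNS.items():
        if pattern.match(text):
            return name
    return "other"

events["date_format"] = events["date_value"].map(classify)

# How many distinct date interpretations does each report mix together?
formats_per_report = events.groupby("report_id")["date_format"].nunique()
print(formats_per_report.value_counts().sort_index())
```

Reports mixing two or more formats were the ones most often flagged as inconsistent by employees.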
Patterns Emerged When Reviewing Employee Behavioral Analytics in BI Tools

To replicate the behavioral analytic findings, I used SQL (Teradata) to pull data from multiple sources and join it on the different date fields.
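The query below is a stand-in for those Teradata pulls: the table names, columns, and credentials are all hypothetical, but it shows the pattern of joining the same orders across sources on different date fields so that mismatched reporting weeks become visible. It uses the official teradatasql driver for Python.

```python
import teradatasql  # official Teradata SQL Driver for Python

# Table and column names are illustrative; the real schemas were internal.
QUERY = """
SELECT o.order_id,
       o.ordered_date,        -- calendar date on the order
       w.julian_ship_date,    -- Julian date on the warehouse record
       c.week_end_date,       -- reporting week on the BI extract
       o.service_level_pct
FROM   orders          o
JOIN   warehouse_moves w ON o.order_id = w.order_id
JOIN   bi_calendar     c ON o.ordered_date BETWEEN c.week_start_date
                                               AND c.week_end_date
"""

with teradatasql.connect(host="tdprod", user="analyst", password="***") as con:
    with con.cursor() as cur:
        cur.execute(QUERY)
        for row in cur.fetchall()[:10]:
            # Rows whose Julian ship date falls in a different reporting week
            # than the calendar ordered date are the discrepancies of interest.
            print(row)
```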

Findings
Historical Inaccuracies: Historical service level figures differed depending on which report, and which date format, was used. This discrepancy highlighted the urgency of correcting the inconsistencies and regaining user trust in the reported metrics.
Oracle Transition Impact: Employees felt uncertain about how information flowed through the system after the Oracle transition; a change of that scale can be unsettling for users.
OBIEE Update Frequency: The discovery that OBIEE refreshed only three times daily was a crucial insight; it limited the timeliness and responsiveness of service level reporting, underscoring the need for real-time or more frequent updates.
Correlation with Accuracy: The data suggested a potential correlation between the warehouse inventory management systems and the information databases they fed. This linkage unveiled an opportunity to enhance data synchronization, potentially contributing to improved accuracy (explored in the sketch below).
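The analysis behind that finding compared synchronization lag against mismatch counts. The sketch below uses made-up illustrative numbers and assumed column names, since the real extracts were internal; it shows the shape of the analysis rather than the actual data.

```python
import pandas as pd

# Illustrative, made-up daily snapshots; in practice these came from joining
# warehouse inventory records with the downstream reporting database.
daily = pd.DataFrame({
    "sync_lag_hours":    [2, 2, 8, 8, 24, 24, 2, 8],  # warehouse -> database delay
    "report_mismatches": [1, 0, 3, 4,  9,  7, 1, 2],  # rows disagreeing between systems
})

# Pearson correlation between synchronization lag and mismatch count.
print(daily["sync_lag_hours"].corr(daily["report_mismatches"]))
```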
Qualitative Insights from Interviews:
Stakeholder Expectations: Interviews with stakeholders shed light on their desire for accurate and timely service level reporting. This qualitative input contextualized the urgency for real-time updates and the need for standardized reporting.
User Frustrations: Employee interviews unearthed their frustrations with deciphering multiple date formats, leading to confusion and misinterpretation. This qualitative understanding underscored the importance of data consistency in enhancing user experience.
Recommendations: Forging the Path to Precision
Designated Monitoring Department: A dedicated monitoring department responsible for verifying and overseeing service level numbers and adjustments would ensure that discrepancies are swiftly addressed.
Standardized Service Level Report: Creating a single source of truth for service level reports, with one agreed-upon set of date definitions, would prevent teams from building conflicting versions.
Evangelizing Findings:
To truly foster change, it is important to be a storyteller: someone who can weave the challenges, solutions, and potential impact into a clear, coherent narrative that resonates with stakeholders.
Impact and Future Direction:
Enhanced Reporting Accuracy: The designated monitoring department and standardized reports collectively contribute to bolstering the accuracy of service level reporting.
Stakeholder Satisfaction: Accurate and consistent reporting instills stakeholder trust, leading to heightened satisfaction with the organization's communication and data transparency.
Data-Driven Decision Making: Reliable service level data empowers informed decision-making across organizational tiers, fostering strategic advancements.
Usability Metrics: Demonstrating Tangible Impact
Time on Task: Before the recommendations were implemented, the average time spent on service level report tasks was 12 minutes. Post-implementation, this dropped to an average of 7 minutes, a 42% improvement.
Effectiveness of Task: Task success rate, which previously stood at 75%, saw a significant surge to 92% following the integration of the recommendations.
User Satisfaction: User satisfaction scores, quantified through post-task surveys, rose from an average of 3.5 out of 5 to a remarkable 4.8 out of 5.
Error Rate: The error rate, which initially hovered at 8%, was reduced to 2% after the user-centric changes were implemented.
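These relative changes follow directly from the before/after measurements; the quick calculation below reproduces them (the 42% figure, for instance, is the relative reduction in time on task).

```python
# Before/after measurements from the usability evaluation above.
metrics = {
    "time_on_task_min": (12, 7),     # lower is better
    "task_success_pct": (75, 92),
    "satisfaction_5pt": (3.5, 4.8),
    "error_rate_pct":   (8, 2),      # lower is better
}

for name, (before, after) in metrics.items():
    change_pct = (after - before) / before * 100
    print(f"{name}: {before} -> {after} ({change_pct:+.0f}%)")
```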