
Greenworks operates at the intersection of high-consideration commerce, hardware complexity, and lifecycle ownership. Customers must navigate tool compatibility, battery systems, property-specific needs, and post-purchase management, all within a fragmented ecosystem of experiences.

As the platform scaled, experience inconsistency, decision friction, and system-level gaps began to impact conversion, retention, and operational efficiency directly.

1000+

Patents Published

500+

Products in Catalogue

4

Business Verticals

Key Case Studies

#1 Overview & Initiatives

Fragmented UX to Scalable Experience Strategy

Built the UX strategy and operating model that aligned teams, embedded user-centered thinking, and enabled scalable execution across web and app.

#3 Experience Overhaul

Designing Decisions, Not Just Pages

Redesigned PDP, PLP and Dashboard experiences to clarify comparison and buying decisions, driving higher conversion, AOV, and lower bounce.

WIP

#2 Design Systems

UX Infrastructure for Scale

Created a unified design system to standardize experience quality, accelerate design to dev workflows, and support experimentation at scale.


#4 AI: My Garage

Decision Support Through Personalization

Created a personalized, signal-driven recommendation flow that adapts to user context and behavior to improve engagement and relevance across product discovery.

WIP
Case Studies

Fragmented UX to Scalable Experience Strategy
How Greenworks’ Design System Enabled Platform Consistency and Speed

Tools

Figma
JIRA
Google Docs
Adobe Creative Cloud
GA4, Shopify Analytics

Role

Deliverables

Lead UX Designer
Experience Strategy 
UX Foundations

  • UX principles & decision frameworks

  • Experience architecture across discovery, purchase, ownership, and AI flows

  • UX operating practices  

    • quality reviews, alignment rituals, insight synthesis

  • Measurement model connecting UX decisions to business outcomes

Impact

↑~30%

task completion across priority journeys

↑~45%

design-to-dev handoff efficiency

6+

Conversion- and retention-focused initiatives

My Role

At Greenworks, I operated at a platform and systems level—owning UX foundations, decision frameworks, and cross-team alignment across multiple commerce, systems, and lifecycle surfaces.

 

I partnered closely with Product, Engineering, Data, and Marketing to:

Set Experience Direction

Defined UX principles and decision frameworks across the end-to-end customer lifecycle.

Establish Scalable UX Foundations

Built design systems and patterns to enable consistency, reuse, and faster execution.

Drive Insight-Led Decisions

Ensured research and behavioral data informed prioritization and design trade-offs.

Lead Platform-Level Initiatives

Owned high-impact initiatives across commerce, lifecycle, and AI-assisted experiences.

The Team​

Senior VP of Sales 

Sr. Director of Digital Channels & E-Commerce

IT E-Commerce Manager 

Digital Marketing Coordinator

Director of Category Management

Senior Content Leader 

Tech Lead

Digital Marketing Manager

Senior Lead UX Designer

That's me!

What I Heard

As I worked closely with different teams, similar frustrations surfaced, expressed differently but rooted in the same gaps.

🤔   How do we know what the user wants?

😕   We don’t know which designs are final vs. exploratory.

😤   We need to move faster, but consistency keeps breaking.

🤑   The only thing that matters is profits!

😳   We have a huge inventory on the website!

😩  We’re solving the same problems again and again.

😖   There’s no shared standard or decision framework.

😵‍💫   We know we need to work on UX, but where do we start???

😐   Do we even need UX? It slows delivery.

😓   It’s hard to tie UX work to business impact.

What I Realized

Teams wanted clarity:

The challenge wasn’t resistance to UX.

It was the absence of structure.

When should UX be involved?

How do we make decisions consistently?

UX needed to move from helpful but optional to predictable and trusted.

What does “good UX” mean here?

How do we know we’re improving?

Goals: Key KPIs

Conversion Rate

Across discovery, purchase, and post-purchase journeys

Retention & Engagement

Stronger account ownership and lifecycle experiences

Platform velocity & scalability

Shared system patterns and reusable components

Friction points across the path

Unifying flows and reducing cross-platform inconsistencies

Time to decision

Faster discovery, clearer decision-making, and streamlined actions

Support dependency

Improving self-service flows and ownership tools

The Shift

Instead of asking,

“What should I design?”

I started asking,

“What system would make a better design inevitable?”

Building the Foundation

UX Culture & Education

Early on, I noticed that most UX friction wasn’t about disagreement—it was about misunderstanding.

  • Engineers weren’t sure why certain UX decisions mattered

  • Stakeholders interpreted UX feedback as subjective

  • Designers had to re-explain the rationale repeatedly

This told me the issue wasn’t skill—it was shared context.

I realized UX needed to function as a shared language, not a specialized function.

 

This led me to:

  • Create common principles that teams could reference

  • Make design rationale explicit and reusable

  • Build trust by demystifying UX decisions

UX couldn’t scale without shared understanding.

Business Alignment

In a fast-moving environment, decisions were often made based on:


  • Past experience

  • Strong opinions

  • Time pressure

Even strong UX ideas stalled when they weren’t clearly tied to business outcomes.

I began reframing UX conversations away from features and toward goals, trade-offs, and impact.

Ensuring that:

  • UX decisions mapped to business priorities

  • Stakeholders saw design as a strategic lever

  • Success could be discussed in terms leadership cared about

UX needed to speak the language of the business to earn influence.

Research-Driven Decisions

I repeatedly heard variations of:


  • “How does this help the business?”

  • “Is this worth prioritizing right now?”

 

This created inconsistency and repeated debates.

Instead of positioning research as a “phase,” I treated it as decision support.

This helped to:

  • Reduce opinion-led back-and-forth

  • Ground discussions in observable behavior

  • Create confidence when trade-offs were necessary

Research wasn’t about validation—it was about clarity.

Cross-Channel Ideation

I noticed teams were solving problems in isolation:


  • Web decisions didn’t consider app implications

  • Purchase experiences didn’t connect to ownership

  • Lifecycle touchpoints were treated as afterthoughts

This fragmented the user experience.

I began stepping back and asking: 

  • “Where does this experience start?”

  • “Where does it actually end?”

This helped:

  • Break channel silos

  • Encourage end-to-end journey thinking

  • Ensure experiences felt cohesive, not stitched together

Users don’t experience products in channels — they experience journeys.

Roadmap & Scale

Even when alignment existed, I saw good ideas fail because:


  • Everything felt equally important

  • Teams lacked sequencing

  • UX work reacted to roadmaps instead of shaping them

I started focusing less on what we should design and more on when and why.

This helped to:

  • Size opportunities realistically

  • Balance quick wins with foundational work

  • Ensure UX decisions aged well as the platform grew

Good UX at scale is as much about timing as it is about quality.

Strategic Pillars

Business Alignment

Goals → Experience strategy

Stakeholder vision

KPI focus

UX Culture & Education

Shared principles

Design rationale

Cross-functional trust

Research-Driven Decisions

Audits

Behavioral insights

User feedback loops

Roadmap & Scale

Opportunity sizing

Sequencing

Platform scalability

Cross-Channel Ideation

Web + App + Lifecycle

End-to-end journeys

So collaboration felt stable instead of disruptive.

To reduce assumption-led execution.

So teams had a common language and reference point.

So design decisions were grounded in impact, not preference.

Making Structure Real: The UX Operating Rhythm

Weekly


Focused design reviews using shared criteria

Design Documentation for exploratory vs final

Daily Huddle with Developers

Monthly


Experience Quality & System Health Reviews
A recurring review of shipped and in-progress work to assess experience clarity, consistency, and adherence to shared UX principles. Patterns, inconsistencies, and one-off solutions are identified early to inform design system updates and prevent downstream rework.

Cross-Functional Experience Alignment
Structured check-ins with Design, Product, and Engineering to align on priorities, trade-offs, and upcoming decisions across discovery, purchase, and ownership journeys. These conversations ensure UX intent remains intact while balancing technical and business constraints.

Insight-to-Impact Reviews
Regular synthesis of behavioral insights, audits, and feedback to connect UX decisions directly to business outcomes. These reviews surface where experience improvements influence conversion, retention, self-service efficiency, and roadmap direction.

Where I Faced the Most Friction

01

Balancing short-term delivery pressure with

long-term system thinking

02

Gaining alignment without formal authority

03

Proving value before metrics were mature

04

Introducing structure without slowing teams down

I addressed these by:

Anchoring conversations around outcomes, not artifacts

Instead of reviewing screens in isolation, I framed discussions around what problem we were solving and how success would be measured.

For example, PLP and PDP reviews focused on reducing decision friction and improving product clarity, rather than visual polish or layout preferences.

Embedding system work inside active projects

Rather than treating the design system as a parallel initiative, I built and validated components while working on high-impact flows like PLP, PDP, and Account.

This allowed the system to evolve from real usage and ensured immediate adoption without slowing delivery.

Prioritizing adoption over perfection

I intentionally shipped a smaller, opinionated set of components that teams could trust and reuse, instead of waiting to build a “complete” system. Feedback from designers and engineers guided incremental improvements, making the system feel practical rather than aspirational.

Measuring progress through behavioral signals, not vanity metrics

In the absence of mature UX metrics, I tracked indicators like reduced clarification cycles, increased component reuse, earlier UX involvement in planning, and fewer repeated debates—signals that UX was becoming more predictable and trusted.

Measuring Progress

Fewer repeated UX debates

“We’ve already aligned on this pattern — let’s use the system instead of reopening the discussion.”

Increased reuse of patterns and decisions

“We used the same component from PLP in Account — it worked without changes.”

Teams proactively asking how to apply UX

“Which UX principle should guide this decision?”

“Is there a recommended pattern we should start from?”

Faster design and engineering alignment

“The specs and patterns are clear enough for us to build this directly”

Clearer linkage between UX work and business priorities

“This aligns with the goal we’re trying to move: improving conversion without adding complexity.”
