Learn how SMEs can use design science research (DSR) to turn complex business problems into testable solutions and make more evidence-based decisions in day-to-day management.

Design Science Research for SMEs: An Expert Guide to Turning Problems into Tested Solutions
Running a small or mid-sized company means making decisions under constraint:
limited resources, limited time, and very real risk if you get it wrong.
At the same time, many leaders and managers are surrounded by theories, models and frameworks that often feel disconnected from daily operations. The result is a familiar tension:
“I know there’s valuable knowledge out there – but I need something that actually helps my business this quarter.”
Design Science Research (DSR) offers a way to close this gap. It is a research approach that focuses on designing and evaluating artefacts – tools, processes, models, methods – that address real-world problems in a systematic and evidence-based way.
This article outlines how SMEs can use DSR principles to structure problem-solving and support more evidence-based management.
What Is Design Science Research – Without the Jargon?
In academic language, DSR is about creating and evaluating “artefacts” that contribute to both practical solutions and knowledge.
In SME language, we can simplify this:
Design Science Research is a disciplined way to design, test, and refine solutions to real business problems, using evidence from your own organisation.
Typical artefacts in an SME context might include:
- A redesigned sales qualification process
- A performance dashboard that integrates financial and operational data
- A new customer onboarding protocol
- A set of decision rules for pricing, scheduling or prioritisation
The emphasis is not on writing about problems, but on building and evaluating solutions.
Why SMEs Tend to Struggle With “Research”
Many SME leaders and managers recognise the value of research and data but encounter recurring obstacles:
- Abstraction – Concepts sound promising but are difficult to translate into concrete action.
- Time pressure – There is little capacity for initiatives that do not show short- to medium-term value.
- Scepticism – Previous method or tool implementations created “initiative fatigue” without visible impact.
- Fragmented knowledge – Insights exist in people’s heads, slide decks, and spreadsheets, but are not integrated.
DSR is relevant precisely because it is problem-driven and intervention-oriented. It does not ask SMEs to “apply theory” in the abstract; it asks them to co-design and test specific interventions that can be evaluated in their own context.
A DSR-Inspired Cycle for SME Problem-Solving
You do not need to implement a full academic DSR protocol to benefit from its logic. The core can be expressed as a repeatable cycle.
Step 1: Clarify the Problem in Operational Terms – With the Right People Involved
The quality of any solution is limited by the quality of the problem framing.
Clarifying the problem is not a solitary desk exercise; it is a structured conversation with the people who see the issue from different angles.
Start by bringing together:
- Front-line employees who work in the process every day
- Process owners or supervisors who understand constraints and trade-offs
- Where relevant, support functions (e.g. finance, IT, quality)
- Optionally, external perspectives (e.g. key customers, partners, or a neutral facilitator)
Use this combined expertise and experience to move from vague complaints to a concise, measurable problem statement.
Instead of:
“Our sales process is chaotic.”
You might arrive at:
“Over the last 6 months, 45–55% of qualified leads drop out between first meeting and proposal. Front-line sales staff report inconsistent qualification and unclear responsibilities for follow-up.”
A robust problem statement typically includes:
- A specific process or area (e.g. sales, project delivery, onboarding)
- Observed symptoms from different roles (e.g. delays, rework, low conversion, frequent escalations)
- Baseline data where available (e.g. conversion rates, lead times, error rates)
- Contextual factors identified by employees (e.g. seasonal peaks, system limitations, conflicting targets)
By deliberately drawing on internal expertise – and, where useful, external input – you reduce the risk of optimising the wrong thing.
Evidence-based management starts here: with a jointly constructed, shared understanding of the problem.
Step 2: Define What “Better” Looks Like
Before designing solutions, specify the criteria for success:
- What should be different in 3–6 months?
- Which indicators will you monitor?
- What minimum change would justify further investment?
Examples of metrics:
- Lead-to-proposal conversion rate
- Time from order to delivery
- Number of change requests per project
- First-time-right rate in production
This aligns the DSR mindset with a results-focused, data-aware way of managing.
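To make a metric like lead-to-proposal conversion concrete, it helps to compute the baseline from data you likely already have, such as a CRM export. The sketch below uses invented lead records and field names purely for illustration:

```python
# Sketch: computing a baseline meeting-to-proposal conversion rate
# from hypothetical CRM export data (all names and figures invented).

from dataclasses import dataclass

@dataclass
class Lead:
    lead_id: str
    had_first_meeting: bool
    received_proposal: bool

leads = [
    Lead("L-001", True, True),
    Lead("L-002", True, False),
    Lead("L-003", True, False),
    Lead("L-004", False, False),
    Lead("L-005", True, True),
]

# Only leads that reached a first meeting count towards this funnel stage.
met = [l for l in leads if l.had_first_meeting]
converted = [l for l in met if l.received_proposal]

baseline = len(converted) / len(met)
print(f"Meeting-to-proposal conversion: {baseline:.0%}")  # 2 of 4 -> 50%
```

Even a five-line calculation like this forces agreement on definitions (what counts as "qualified"? as "a proposal"?), which is half the value of defining the metric.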
Step 3: Design a Focused Artefact
Next, you design an artefact – a tangible intervention aimed at the defined problem. The guiding principle is: small enough to pilot, substantial enough to matter.
Examples:
- A decision-support spreadsheet for prioritising projects
- A template for project kick-off meetings
- A checklist for sales discovery calls
- A visual workflow that standardises handovers between departments
Here, DSR differs from ad hoc problem-solving:
- The artefact is explicitly specified (you can describe or document it).
- It is grounded in existing knowledge (internal experience + external research).
- It is designed with evaluation in mind (you know what to look for later).
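As one illustration, a decision-support artefact for prioritising projects can start as a handful of explicit scoring rules before it ever becomes a spreadsheet or tool. The criteria, weights, and projects below are invented assumptions, not a prescribed model; the point is that the rules are written down and therefore testable:

```python
# Sketch: a simple weighted-scoring rule for project prioritisation.
# Criteria, weights, and ratings are illustrative assumptions only.

weights = {"strategic_fit": 0.4, "expected_margin": 0.35, "delivery_risk": 0.25}

projects = {
    "Project A": {"strategic_fit": 4, "expected_margin": 3, "delivery_risk": 2},
    "Project B": {"strategic_fit": 2, "expected_margin": 5, "delivery_risk": 4},
}

def priority_score(scores: dict) -> float:
    # Each criterion is rated 1-5; delivery_risk is inverted so that
    # lower risk contributes a higher score.
    adjusted = dict(scores, delivery_risk=6 - scores["delivery_risk"])
    return sum(weights[c] * adjusted[c] for c in weights)

ranked = sorted(projects, key=lambda p: priority_score(projects[p]), reverse=True)
for name in ranked:
    print(f"{name}: {priority_score(projects[name]):.2f}")
```

Because the weights and rules are explicit, they can be challenged, adjusted after the pilot, and documented as part of the artefact itself.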
Step 4: Implement a Deliberate Pilot
Rather than rolling out a new solution across the organisation, you run a controlled pilot:
- Limit the scope (e.g. one team, one product line, one region).
- Set a defined timeframe (e.g. 4–6 weeks).
- Ensure participants understand the objective and the measures.
The key is to treat the pilot as an experiment, not as a final roll-out:
- You expect to adjust the artefact.
- You are explicit about what will be observed.
- You document the context (e.g. seasonality, staffing changes, external events).
Step 5: Collect Evidence – Quantitative and Qualitative
DSR emphasises evaluation. For SMEs, this typically means combining:
- Quantitative data
  - Before/after performance metrics
  - Counts (e.g. errors, delays, complaints)
  - Time or cost measures
- Qualitative insights
  - Feedback from employees using the artefact
  - Observations about where the solution fits or clashes with reality
  - Perceived benefits and burdens
The objective is not to run a perfect statistical study, but to gather sufficient, credible evidence to inform a managerial decision.
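A minimal before/after comparison is often enough to inform that decision. The sketch below uses invented baseline and pilot figures; for high-stakes decisions you would want more data points and, ideally, a proper statistical test:

```python
# Sketch: comparing pilot metrics against a baseline (figures invented).
# A plain relative-change check is often sufficient for a managerial
# decision; it is not a substitute for statistical testing.

baseline = {"handover_errors": 12, "avg_lead_time_days": 9.5}
pilot    = {"handover_errors": 7,  "avg_lead_time_days": 8.0}

changes = {}
for metric in baseline:
    change = (pilot[metric] - baseline[metric]) / baseline[metric]
    changes[metric] = change
    print(f"{metric}: {baseline[metric]} -> {pilot[metric]} ({change:+.0%})")
```

Pairing a table like this with the qualitative feedback from the pilot team gives a decision basis that is simple, documented, and repeatable.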
Step 6: Refine, Decide, and Capture the Learning
Based on the evidence, you can:
- Adopt and scale the artefact (with minor adjustments)
- Modify and re-test in a further pilot
- Or reject the artefact and redirect effort
From a DSR perspective, this step is crucial: you extract design principles and contextual insights, not just a yes/no decision.
For example:
- “For complex B2B projects, a standardised kick-off improves alignment, but only if the client is actively involved.”
- “A visual workflow reduces handover errors, but only when integrated into existing tools, not as an extra document.”
These principles become reusable knowledge for future initiatives and support a more consistent, evidence-based management style.
Why This Approach Fits Managers Who Want Evidence-Based Decisions
Managers who take decisions seriously are looking for more than opinions or trends. They need a way to:
- Connect decisions to real problems – DSR starts with a clearly formulated organisational problem, not with a generic solution.
- Make assumptions explicit – Designing an artefact forces teams to articulate how they believe value will be created.
- Test decisions in practice – Pilots and evaluation cycles provide evidence from your own context, not just borrowed benchmarks.
- Build organisational learning, not just fixes – By capturing design principles and insights, each experiment contributes to a growing base of internal know-how.
For managers, this turns DSR from an abstract research method into a practical discipline for structuring decision-making and de-risking change.
Practical Considerations for SMEs Starting with DSR
For SMEs, a DSR-inspired initiative should respect real constraints:
- Time – Aim for pilots that can be designed and launched in 4–6 weeks, not quarters.
- Capacity – Use existing data where possible; avoid building complex measurement systems initially.
- Risk – Choose pilots where failure is informative but not catastrophic.
- Communication – Frame the work as practical improvement, not “a research project”.
A useful starting point is to identify:
- One high-impact, recurring problem
- One unit or team that is open to experimentation
- One artefact type that is realistic (template, checklist, dashboard, protocol)
Common Pitfalls – and How to Avoid Them
Even with a structured approach, several pitfalls are typical:
- Problem definition too broad
  - Symptom: “We want to improve our whole innovation process.”
  - Alternative: Narrow to one stage, one product line, or one customer segment.
- Artefact too complex
  - Symptom: The solution requires new software, major training, or multiple approvals.
  - Alternative: Start with a simpler version (e.g. spreadsheet, manual checklist).
- Insufficient evaluation
  - Symptom: Changes are made, but no baseline or comparison data exists.
  - Alternative: Decide in advance which 2–3 indicators you will track.
- No captured learning
  - Symptom: People remember that “we tried something”, but not what was learned.
  - Alternative: Document key findings and design principles in a short, structured format.
Anticipating these pitfalls is part of building an evidence-based management culture: one that values structured experimentation over ad hoc fixes.
Next Steps – From Concept to First Pilot
If you are an SME leader or manager considering a DSR-based approach, a practical sequence might look like this:
- Select a problem that is strategically relevant and observable.
- Map existing knowledge – internal experience and any relevant external research or benchmarks.
- Draft 1–2 artefact concepts that could realistically be piloted.
- Plan a short pilot with clear boundaries and metrics.
- Schedule a review point to interpret evidence and decide on next actions.
From there, you can gradually build a portfolio of tested artefacts and design principles that strengthen both your organisation’s performance and your decision-making capability.
Closing Thoughts
Design Science Research does not require you to become a full-time researcher. It requires you to:
- Treat key business challenges as structured design–test–learn cycles
- Make your solutions explicit, testable, and evidence-informed
- Capture and share what you learn, so it becomes organisational knowledge, not just personal experience
If you would like a structured conversation about how to use a DSR-style approach to support more evidence-based decisions in your organisation, you can:
- Reach out for a short exploratory call at contact@nullpointventures.com
- Share a brief outline of your context (industry, role, key problem), and we can identify potential artefact and pilot options.
